The Monolithic Database in a Hybrid World
For a number of years now, software vendors have been working to modularize their monolithic applications, with the intent of giving consumers the opportunity to mix and match best-of-breed functionality without being tied to one specific package. The interaction between ERP and CRM is a prime example.
Looking at the Microsoft offerings in this space, we see Dynamics AX and Dynamics CRM dominating the market. AX does include CRM functionality; however, it is not considered as robust as Dynamics CRM. Many customers prefer Dynamics CRM’s inherent online functionality and tight integration with Outlook, something AX will likely never have. So when a business requires both ERP and CRM functionality, it will generally invest in both products to provide the best experience to its user base.
Although both AX and CRM could be considered monolithic applications, each offering a full suite of functionality, the actual implementation of each package can be very modular. For example, a number of companies implement AX strictly for financials, leaving the rest of the functionality unused.
This works very well, with one (very big) exception: the database. For both systems, regardless of what functionality is used, there is a certain base amount of data that needs to be maintained. In this regard the databases can also be considered monolithic.
In traditional deployments, this was never really a big issue because the two packages were likely running in the same data center, and probably on the same server. So although integration may have been required, distance was not a factor. From within AX, we could write code that accessed the CRM database via ODBC to provide a real-time view into CRM activities. We could also link the databases within SQL Server if they were running on different servers, streamlining the access. In a hybrid world, the databases could be hundreds of miles apart, and the latency alone would be enough to frustrate end users waiting for a data pull.
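To see why distance suddenly matters, consider the round trips. A chatty integration that fetches rows one at a time pays the network latency on every call, whether it goes over ODBC or a linked server. Here is a minimal back-of-the-envelope sketch; the round-trip count and latency figures are hypothetical, chosen only to illustrate the scale of the difference:

```python
# Hypothetical illustration: the same chatty data pull on a LAN vs. a WAN.
# Each row-by-row lookup against the remote database costs at least one
# network round trip, so total wait time scales with round_trips * latency.

def data_pull_seconds(round_trips: int, latency_ms: float) -> float:
    """Lower bound on wall-clock time spent waiting on the network."""
    return round_trips * latency_ms / 1000.0

# A screen that issues 200 individual lookups against the CRM database
# (all numbers below are assumptions for illustration):
same_data_center = data_pull_seconds(200, 0.5)   # ~0.5 ms within a data center
cross_country    = data_pull_seconds(200, 40.0)  # ~40 ms between distant regions

print(f"LAN: {same_data_center:.2f} s")  # 0.10 s -- unnoticeable
print(f"WAN: {cross_country:.2f} s")     # 8.00 s -- users will notice
```

The arithmetic is trivial, but it is the whole story: the code did not change, only the distance did, and the per-row chattiness that was invisible on a LAN becomes an eight-second stall over a WAN.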
The real solution is for application developers to take database interactions into account as they write software; the alternative can and will cause significant issues as companies move toward a hybrid world. It can no longer be assumed that databases will be running on the same server, or even on the same LAN. As a result, real-time database-sharing methods may no longer be realistic.
OK, we have dealt with this issue before, where disparate systems need to share data; however, all of those solutions are built around the monolithic architecture, where the data must reside completely within both systems. Anyone who has worked with integration software knows that it can be a real headache. Like it or not, for the short term at least, integration may be the only solution.
Even with the recent announcement of the Microsoft Dynamics 365 offerings, the applications are not truly sharing a common database. Each component is still a stand-alone application, but what Dynamics 365 offers is a better integration experience.
By providing the Common Data Model (CDM) platform, Microsoft is facilitating the data exchange between the monolithic databases. CDM is not just available for out-of-the-box AX and CRM. It can be expanded to include custom entities and even other databases.
CDM and its companion app, Flow, are not the first integration offerings from Microsoft; however, they are the only cloud-first ones. SQL Server Master Data Services offers a similar level of integration; it is just managed and deployed differently, and therefore requires a different skillset.
Microsoft is also not the only vendor offering these tools. Other contenders, such as Atlas, have been around for years and offer many out-of-the-box connectors.
Ideally, as development efforts around cloud offerings mature, we will start to see more and more native integration options, which could easily evolve into something as complete as a software-defined database. Although this would not resolve the latency issue, it would make integration a lot easier, as software could consume data natively from the source without having to worry about duplicating adds, changes, and deletes across multiple instances of the truth.
Until then, data imports, exports, and transformations are going to become the norm for companies that are not ready to shift their entire application portfolio to the cloud at the same time. Integration will also be required for companies looking to take advantage of a globally available offering, such as Azure IaaS. If a company has a significant sales presence in two or more countries, wouldn’t it make sense to have a data center presence in those countries to alleviate latency and networking costs? The monolithic database becomes the limiting factor in this kind of deployment.
Any company developing a cloud migration strategy needs to include database dependencies in its workload selection criteria. Data is often not thought about until after the fact, and can cause significant project delays while workarounds and solutions are put in place. Selecting the data integration solution(s) ahead of time and determining where that skillset will reside – either with internal experts or with a partner/solution provider – will greatly improve the success rate of any cloud migration strategy.