Jeff Fried of InterSystems discusses dealing with the challenges of the data-driven age.
Traditional database architectures are struggling to cope in today’s data-driven world. Data volumes are escalating, and the data coming into organisations often lacks a clearly defined structure. Some of it is event-driven, from operational systems; some is transactional, from back-office systems.
Most businesses also hold large volumes of historical or reference data, often stored in silos and kept separate from current or newly ingested stores. Much of it is also scattered across different applications, from marketing mailing lists to payroll and accounting programs.
Data governance can sometimes be restrictive, making it difficult for organisations to get a comprehensive picture of what they hold. On top of all this, system configurations are becoming increasingly complex as organisations try to shoehorn traditional architectures into these dynamic and demanding situations.
The resulting cost and complexity inhibit businesses from making good use of their data, so it is imperative that organisations deploy the right kinds of data platform architectures going forward.
Surveys highlight database challenges
As part of a research study commissioned by InterSystems, analyst firm Enterprise Strategy Group (ESG) surveyed more than 350 IT and business professionals across enterprise and mid-market organisations, all familiar with their organisation’s current database environment, and asked them about the top challenges they face with their current database deployments and infrastructure. The two most cited were managing data growth and database size (48pc) and meeting database performance requirements (35pc).
Adding to the complexity is the sheer number of individual databases in place within many organisations. In fact, 38pc of respondents to the ESG research reported having between 25 and 100 unique database instances, while another 20pc had more than 100.
Businesses often also have databases of widely different ages. It is not unusual, for example, for them to have flat files that are 20 years old, relational databases that are 15 years old and document databases created yesterday within the same environment. Therefore, it’s not surprising that organisations may be struggling to access and process data in real time to drive strategic business decisions.
More than 75pc of executives responding to a recent survey conducted by IDC in partnership with InterSystems agree that untimely data has inhibited business opportunities. This lag is also slowing the pace of business, the survey finds, with 54pc stating that it limits operational efficiency and 27pc saying it has negatively affected productivity and agility.
Of course, all these challenges come at a time when businesses are digitalising their operations, wanting to transition to the cloud and to harness the latest innovations, including business analytics, machine learning and artificial intelligence (AI).
Scoping out a data management solution
So, how can organisations address these issues? The challenge typically starts with data ingest. Organisations need to find data platforms that can ingest data from real-time activity, transactional activity and document databases.
Platforms must also be able to take in data of different types, from different environments and of different ages, and normalise it to make sense of it. Interoperability is key here: any chosen solution needs to be able to ‘touch’ those disparate databases and silos, bring information back and make sense of it in real time.
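To make the idea of normalising disparate sources concrete, here is a minimal sketch, not any specific vendor’s API: records from a legacy flat file, a relational table and a document store are each mapped into one common shape. The field names (`customer_id`, `full_name` and so on) and the sample data are hypothetical, chosen only to illustrate the mapping.

```python
import csv
import io
import json
import sqlite3

def normalise(record: dict, source: str) -> dict:
    """Map a raw record from any source into one common shape."""
    return {
        "customer_id": str(record.get("id") or record.get("customer_id")),
        "name": record.get("name") or record.get("full_name"),
        "source": source,
    }

# 1. A decades-old flat file (CSV).
flat = csv.DictReader(io.StringIO("id,name\n1,Acme Ltd\n"))

# 2. A relational table (in-memory SQLite stands in for an RDBMS).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (customer_id INTEGER, full_name TEXT)")
db.execute("INSERT INTO customers VALUES (2, 'Beta Corp')")
db.row_factory = sqlite3.Row
rows = [dict(r) for r in db.execute("SELECT * FROM customers")]

# 3. A document-database record (JSON).
docs = [json.loads('{"customer_id": "3", "name": "Gamma Inc"}')]

# One unified view, regardless of source type or age.
unified = (
    [normalise(r, "flat-file") for r in flat]
    + [normalise(r, "relational") for r in rows]
    + [normalise(d, "document") for d in docs]
)
print([u["customer_id"] for u in unified])  # → ['1', '2', '3']
```

The point of the sketch is that once every record lands in the same shape, downstream analytics no longer care whether the source was a 20-year-old flat file or a document database created yesterday.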
Data platforms must also be agile. Organisations need to store data where it is needed and ensure it remains accessible, and they must always be able to separate the data they want in any application from the data they don’t need.
As businesses move systems and applications into the cloud, they are also starting to use software to ‘containerise’ their applications and modules. Once containers have been set up in the cloud, they can be reused by other applications within the suite. The process is increasingly straightforward but, to deliver true business advantage, there must also be a focus on implementation speed.
Data platforms must deliver high-quality business analytics to drive competitive edge. They need to integrate business intelligence, predictive analytics, distributed big-data processing, real-time analytics and machine learning. They must analyse real-time and batch data simultaneously at scale, allowing developers to embed analytic processing into business processes and transactional applications, enabling programmatic decisions based on real-time analysis.
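What embedding analytic processing into a transactional application can look like is sketched below, under stated assumptions: a hypothetical payment flow runs a simple statistical check inline, so each transaction gets a programmatic decision in real time while approved payments feed back into the historical (batch) view. The threshold, data shapes and function names are illustrative only.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class AccountHistory:
    # Rolling record of past transaction amounts: the "batch" view.
    amounts: list = field(default_factory=list)

def score(history: AccountHistory, amount: float) -> float:
    """Simple analytic: how many times the account's mean is this amount?"""
    if not history.amounts:
        return 0.0
    return amount / mean(history.amounts)

def process_payment(history: AccountHistory, amount: float) -> str:
    # The analytic runs inline with the transaction, so the decision
    # is made programmatically, in real time, per payment.
    decision = "review" if score(history, amount) > 3.0 else "approve"
    if decision == "approve":
        history.amounts.append(amount)  # approved payments update the batch view
    return decision

acct = AccountHistory(amounts=[20.0, 25.0, 30.0])
print(process_payment(acct, 28.0))   # → approve
print(process_payment(acct, 400.0))  # → review
```

A real platform would replace the toy mean-based score with a trained model and a shared feature store, but the pattern is the same: analysis and transaction processing happen in one place, on the same data, at the moment of the event.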
The challenges around data growth are clear. In addressing these, organisations should look for unified data platforms that can integrate different types of data from multiple sources with real-time and batch analytics capable of driving enhanced business insight and decision-making. In the new data-driven age, we are seeing the dawning of a new era of the agile, flexible data platform.
By Jeff Fried
Jeff Fried is the director of product management at InterSystems, where he is involved with everything from big data and enterprise search to product development and strategic marketing.