Quality over quantity: Unleashing the power of small data


14 Jul 2022

Chris Wilson, InterSystems. Image: James Pike

InterSystems’ Chris Wilson explains why bigger is not always better when it comes to data.

The advent of the cloud has made businesses greedy for more data. While data is certainly instrumental to innovation, many organisations have lost sight of the fact that quantity isn’t everything. In fact, large numbers of businesses are collecting far more data than is truly needed to uncover the core insights they desire.

The impacts of this are manifold. Not only does it drive up the cost of storing and processing such large volumes of data, but the data itself can be vast, disorganised and difficult to source, resulting in an ever-growing number of silos.

Additionally, once they have sourced the data, many organisations find that they can’t actually use much of it, or at least not effectively, owing to a lack of context or to biases in the data, for example.

Beyond this, having so much data to sift through can ultimately be a distraction. After all, digging into the data to get the answers they want can take significant time, which in turn impacts a firm’s ability to be agile.

Consequently, firms need to take a new approach to avoid becoming overwhelmed by too much, often poor-quality, data.

Financial services organisations, for example, have to understand exactly what the business is trying to achieve, work out the questions they subsequently need to ask of their data, and find a way to answer them with the smallest possible dataset.

In a world where bigger is often thought of as better, the power of small data – the minimum viable dataset required for the task in hand – shouldn’t be underestimated. In any case, it takes the right data to produce the right business results.

The fundamentals of small data

So, what exactly do we mean by ‘the minimum viable dataset’? In short, it refers to the smallest possible amount of data that is needed to enable an organisation to act effectively, such as to power any models that have been designed.

When it comes to defining just how much data that actually is, the focus should be on what the business needs and on obtaining clean, high-quality data. The importance of high-quality data can’t be overstated, with quality trumping quantity.

One example of this in practice: if a bank were to begin selling green mortgages, it would first need to understand who to target and how. For this, it would require access to a small amount of high-quality data that answers those questions, helping it to determine which individuals have an environmental need or leaning and would therefore be likely to take out a green mortgage rather than a traditional one.
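To make the idea concrete, the sketch below (a hypothetical illustration rather than InterSystems code; the column names and the targeting rule are invented for the example) shows how a bank might trim a broad customer extract down to a minimum viable dataset for green-mortgage targeting, using Python and pandas.

    import pandas as pd

    # Hypothetical customer extract; every column name here is illustrative only.
    customers = pd.DataFrame({
        "customer_id": [1, 2, 3, 4],
        "epc_rating": ["C", "B", "E", "A"],       # energy rating of current property
        "green_product_holdings": [0, 2, 0, 1],   # e.g. ethical funds, EV finance
        "marketing_opt_in": [True, True, False, True],
        "card_transactions_last_year": [1204, 356, 980, 2210],  # not needed for this question
    })

    # Minimum viable dataset: only the columns needed to answer
    # "who is likely to want a green mortgage, and can we contact them?"
    mvd = customers[["customer_id", "epc_rating",
                     "green_product_holdings", "marketing_opt_in"]]

    # A simple, transparent targeting rule applied to that small dataset.
    prospects = mvd[(mvd["green_product_holdings"] > 0) & mvd["marketing_opt_in"]]
    print(prospects["customer_id"].tolist())  # -> [2, 4]

The point is not the rule itself, but that a handful of well-chosen attributes answers the business question without touching the rest of the customer record.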

Firms must also determine whether they have the data they need or whether they would benefit from integrating data from third parties. For many firms, a lack of data integration capabilities has historically made integrating external data sources challenging and resulted in the creation of additional data swamps.

However, modern data architectures that act as connective tissue are primed to help them overcome this issue. Otherwise known as a smart data fabric, this connective tissue speeds and simplifies access to data assets across the entire business.

It accesses, transforms and harmonises data from multiple sources, on demand, to make it usable and actionable for a wide variety of business applications. Meanwhile, embedded analytics capabilities like machine learning and artificial intelligence help firms to derive greater insights from small data, and to do so in real time.
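As a rough sketch of the idea (generic Python rather than any specific smart data fabric product; the source systems and field names are assumptions for illustration), the snippet below harmonises records from two differently shaped sources into a single on-demand view instead of copying everything into yet another store.

    import pandas as pd

    # Two hypothetical source extracts with inconsistent schemas.
    core_banking = pd.DataFrame({
        "cust_no": [101, 102],
        "avg_balance_gbp": [5400.0, 120.5],
    })
    crm = pd.DataFrame({
        "customer_id": [101, 102],
        "segment": ["retail", "premier"],
        "last_contact": ["2022-06-01", "2022-05-14"],
    })

    def harmonised_view(customer_ids):
        """Assemble one harmonised record per customer on demand,
        rather than duplicating every source system into a new silo."""
        left = core_banking.rename(columns={"cust_no": "customer_id"})
        view = left.merge(crm, on="customer_id", how="inner")
        view["last_contact"] = pd.to_datetime(view["last_contact"])
        return view[view["customer_id"].isin(customer_ids)]

    print(harmonised_view([101]))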

Small data, big results

Adopting a small data approach offers financial services firms a wide range of benefits, including risk reduction. Working with a smaller amount of much more relevant real-time data enables firms to gather critical insights far more quickly than if they had to sift through large volumes of data, and empowers them to spot potential risks faster.

For instance, if a banker is focused solely on selling mortgages, seeing every single customer transaction will take up precious time and garner few results. Conversely, by being able to quickly access only the most relevant transactions and data, they can make more informed decisions about who to offer mortgages to and on what terms, thereby reducing risk.
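A minimal sketch of that filtering step, assuming an invented transaction format and an invented list of mortgage-relevant categories, might look like this in Python:

    from datetime import date

    # Hypothetical transaction records: (customer_id, date, category, amount)
    transactions = [
        (1, date(2022, 6, 1), "rent", -1200.0),
        (1, date(2022, 6, 2), "coffee", -3.5),
        (2, date(2022, 6, 3), "salary", 4100.0),
        (2, date(2022, 6, 5), "rent", -950.0),
    ]

    # Only the categories that matter for a mortgage-affordability view.
    RELEVANT = {"rent", "salary", "loan_repayment"}

    def mortgage_relevant(txns):
        """Keep just the transactions a mortgage adviser needs to see,
        instead of streaming every card payment past them."""
        return [t for t in txns if t[2] in RELEVANT]

    for txn in mortgage_relevant(transactions):
        print(txn)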

This approach therefore allows users to access data and insights more quickly, reduces cost by reducing the amount of data that needs to be stored and analysed, and increases agility. Furthermore, handling less data reduces risk around GDPR and data governance, as well as the risk of data theft.

Quality over quantity

Ultimately, where data is concerned, bigger isn’t always better. Instead, what matters is having access to high-quality, relevant data to power the initiatives that truly matter to the organisation.

By adopting a small data approach and arming themselves with the right data infrastructure, firms will be able to do much more with less, gaining more intelligent and relevant insights that will allow them to increase efficiency, reduce risk and become more agile, while also obtaining a 360-degree view of their business.

This will not only empower those on the frontline to make more informed, accurate decisions, but it will also better serve the business leaders, who can gain greater insights at a glance.

By Chris Wilson

Chris Wilson is the sales director at data management company InterSystems.
