Verne Global’s Dominic Ward discusses the sustainable data gravity paradox and how it could be flipped by utilising the mobile nature of data.
Data gravity refers to the concept that with sufficient scale, a given mass of data attracts more data, which in turn pulls applications and workflows with it. The greater the mass of that data, the more gravitational pull it will have.
The term data gravity was coined by software engineer Dave McCrory, who described the gravitational pull that large masses of data seem to exert on IT systems. The analogy is to physics, which posits that objects with sufficient mass pull objects with less mass towards them.
To some extent, this has been at the centre of the model by which all digital infrastructure has developed over decades. Historically, datasets were monolithic, often created from one source, stored in one location and processed in a linear manner.
Data and the infrastructure supporting it grew for decades in centralised locations, often dictated by proximity to internet exchange connectivity.
Today, data creation is dynamic, omnipresent and, perhaps most importantly, incredibly valuable, growing at a scale that is hard to contemplate. The volume of datasets continues to grow exponentially, with two-thirds of the world’s data created in the last three years.
However, the more data we create, the greater the mass, the greater the gravity and the harder it becomes to escape the gravitational pull of that dataset. This is now creating significant challenges.
The data gravity problem
The historic proximity requirements of the digital infrastructure industry and the data gravity that has resulted has created an over-concentration of the digital infrastructure industry in metro locations such as London, New York, Dublin, Frankfurt, Ashburn and Amsterdam.
Some of these locations can no longer support the current, let alone future, growth being drawn to them. As data gravitates to those concentrated areas, the physical infrastructure of land, buildings, water and most importantly power, cannot keep up.
Current estimates suggest that data centres in Dublin will soon be consuming 20pc of the city’s power, which is 10 times the estimated global average and up from closer to 5pc only seven years ago.
This enormous growth has largely been driven by hyperscale cloud operators such as Microsoft, Google and AWS expanding their data centre footprints in a location in which they have been established for decades.
Even worse, the cloud operators artificially strengthen data gravity with immensely high data egress costs from their platforms. This is clearly unsustainable in more ways than one.
Not only are the resources available for this type of infrastructure finite, but Ireland’s energy production also relies heavily on a predominantly carbon-generating grid, meaning that its data centre industry is far from green.
Availability of power and infrastructure is one factor; another is the economics of the digital infrastructure industry, particularly the data centre industry, which has also changed significantly.
Due to recent events, the power prices in London and Frankfurt have risen dramatically, increasing the cost of operating data centres enormously. This has a knock-on effect for the economic viability of storing and processing data in these historically gravitationally dense locations.
Does it still make sense for a German company to process all of its applications and data in a data centre in Frankfurt if the price to do so has risen five times in the last 12 months?
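The effect of a price rise like this on operating cost can be sketched with a back-of-envelope calculation. The Python below is illustrative only: the workload size, PUE and electricity prices are hypothetical assumptions chosen for the example, not figures from the article.

```python
# Illustrative only: all load, PUE and price figures below are
# hypothetical assumptions, not actual market data.

def annual_power_cost(it_load_kw: float, pue: float, price_eur_per_kwh: float) -> float:
    """Annual electricity cost for a data centre workload.

    it_load_kw: average IT load in kilowatts
    pue: power usage effectiveness (total facility power / IT power)
    price_eur_per_kwh: electricity price in EUR per kWh
    """
    hours_per_year = 8760
    return it_load_kw * pue * hours_per_year * price_eur_per_kwh

# A hypothetical 500 kW workload at an assumed PUE of 1.4:
before = annual_power_cost(500, 1.4, 0.10)  # assumed pre-rise price
after = annual_power_cost(500, 1.4, 0.50)   # assumed five-fold price rise

print(f"Before: EUR {before:,.0f}/yr, after: EUR {after:,.0f}/yr")
```

Because power is a dominant operating cost, a five-fold electricity price rise flows almost directly into a five-fold rise in the annual power bill, which is the economic pressure the question above describes.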
Overcoming data inertia
Data gravity creates challenges. As datasets grow and their gravitational pull increases, more concentration and pressure have been put on the historically centralised digital infrastructure locations.
Not only are those locations running at or beyond capacity, they are also not sustainable. Therein lies the sustainable data gravity paradox.
The greater the data gravity of a location, the more pressure is placed on its finite resources under the current model. Something has to change, and it can.
Some data is latency sensitive, as in high-frequency trading. Some must reside in a specific country to satisfy data privacy regulations. Other data must be hyper-connected, as is the case for content distribution.
For these types of data, there is likely a strong need for it to be located, stored and processed in a specific location. Cost, efficiency and sustainability may play a less important role in the location of that data as a result.
However, what about data that is less latency sensitive, does not have specific data privacy requirements or need to be hyper-connected? Should that data also reside in resource-constrained, expensive, inefficient and less sustainable locations? One would hope the answer is no.
The fact is, if your data or applications can sit in a cloud environment, they can probably sit anywhere, as you rarely have control over, or knowledge of, where your data resides in a cloud environment.
Data gravity can be strong, but what if the gravitational pull of lower cost, more efficient, more sustainable digital infrastructure pulled harder than the gravitational pull of the data itself?
Data is inherently mobile. Terrestrial, wireless and subsea networks transport massive, almost unimaginable, volumes of data around the world, ever faster and at ever lower cost. It is possible, and it is time, to break the sustainable data gravity paradox.
Sustainability is finally getting the attention it deserves and data centre customers are driving demand for sustainable solutions.
Governments need to protect their resources, which are limited and hard to expand in the short and medium term, and allocate them on a fair and reasonable basis.
Ultimately, supply and demand will dictate choice. When a resource like power faces excess demand, prices continue to rise, straining or even crowding out other uses.
Power is more readily available, lower in cost and truly sustainable in locations like Iceland, which is only 10 milliseconds away via a subsea fibre link. Meanwhile, a potential way for governments to influence demand is by requiring more transparent reporting that unmasks the ‘greenwashing effect’.
By Dominic Ward
Dominic Ward is the CEO of Verne Global, a UK-headquartered data centre operator with a data centre campus in Iceland. A version of this article appeared on Verne Global’s blog.