Behold, The Machine: The largest single-memory computer ever built

16 May 2017

Still from ‘The Computer Built for the Era of Big Data’. Image: Hewlett Packard Enterprise/YouTube

While quantum computing is widely seen as the future of the field, HPE has unveiled the largest single-memory computer ever built, with 160TB of memory.

Data is the new currency of the modern world, but processing vast quantities of data – produced by everything from our smartphones to the widespread adoption of autonomous vehicles – requires computers of enormous power.

One of the leading areas of computer science in the past few years has been quantum computing.

The only problem is that, while significant progress has been made towards a technology that would far surpass existing computing power, quantum computing remains largely at the experimental stage of development.

However, one computer that could tide us over until then has been unveiled by Hewlett Packard Enterprise (HPE).

Part of a research project referred to as ‘The Machine’, the supercomputer is the largest single-memory computer ever built. It is designed specifically to handle the sheer volume of information that is, and will be, created within the internet of things.

This new prototype contains 160TB of memory, allowing it to handle the same amount of information found in 160m books.

Until now, it had never been possible to hold and manipulate whole data sets of this size in a single-memory system. This capability will be crucial to realising HPE’s plan for a custom-built architecture for the era of big data.

‘Nearly limitless pool of memory’

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of HPE.

“To realise this promise, we can’t rely on the technologies of the past. We need a computer built for the big data era.”

More importantly, HPE said The Machine project is scalable to the point of a “nearly limitless pool of memory”, amounting to 4,096 yottabytes, or 4,096,000,000,000,000,000,000,000,000 bytes.

To put this into further context, this would amount to 250,000 times the digital data that exists today.
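The figures quoted above can be sanity-checked with a little back-of-the-envelope arithmetic. The sketch below assumes decimal SI units (1TB = 10^12 bytes, 1 yottabyte = 10^24 bytes) and treats “a book” as whatever share of 160TB each of the 160m books would get; the variable names are illustrative, not from HPE.

```python
# Back-of-the-envelope check of the article's scale figures,
# using decimal SI units (1 TB = 10**12 bytes, 1 YB = 10**24 bytes).

TB = 10**12   # terabyte
ZB = 10**21   # zettabyte
YB = 10**24   # yottabyte

# 160 TB of memory spread across "160m books":
prototype_memory = 160 * TB
bytes_per_book = prototype_memory // (160 * 10**6)
print(bytes_per_book)  # 1,000,000 bytes -> roughly 1 MB of text per book

# A "nearly limitless pool" of 4,096 yottabytes, said to be
# 250,000 times the digital data that exists today:
max_pool = 4096 * YB
implied_digital_universe = max_pool / 250_000
print(implied_digital_universe / ZB)  # ~16.4 zettabytes
```

The second result lines up with contemporary estimates that put the world’s digital data in the mid-teens of zettabytes around 2017, which suggests the “250,000 times” comparison is internally consistent.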

As HPE points out, such an amount of memory would make it possible to simultaneously work with every digital health record of every person on earth, every piece of data from Facebook, every trip with Google’s autonomous vehicles and every dataset from space exploration.

“We believe memory-driven computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO of HPE and director of HP Labs. “The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”

Colm Gorey is a journalist with Siliconrepublic.com

editorial@siliconrepublic.com