Bristol-based AI chipmaker Graphcore raises $150m

28 Feb 2020

From left: Graphcore CEO Nigel Toon and CTO Simon Knowles. Image: Graphcore

Graphcore has raised $150m for its specialised AI chips that are designed to support new breakthroughs in machine intelligence.

This week, it emerged that Bristol-based AI chipmaker Graphcore has raised $150m in funding, extending the Series D round it began in December 2018 and bringing its valuation to approximately $1.95bn.

Graphcore’s list of investors includes big names such as Amadeus Capital Partners, Atomico, BMW iVentures, Bosch, Dell Technologies Capital, Draper Esprit, Microsoft, the Samsung Catalyst Fund and Sequoia.

Global demand for chips continues to grow, with Deloitte recently reporting a particular sense of urgency around the development and manufacture of AI chips. Graphcore has now raised a total of $450m in funding and told TechCrunch that it has $300m in cash reserves.

TechCrunch said this is “an important detail considering the doldrums that have plagued the chipmaking market in the last few months, and could become exacerbated now with the slowdown in production due to the coronavirus outbreak”.

Graphcore’s IPUs

The UK company has developed an intelligence processing unit (IPU), a processor designed specifically for machine intelligence workloads. Paired with the company’s Poplar software stack, the IPU aims to offer developers a “powerful, efficient, scalable and high performance solution”, which could enable new innovations in AI.

On Graphcore’s website, the company says the IPU has been designed to support complex data access efficiently and at much higher speeds than were previously possible. It also claims that running high-performance training and low-latency inference on the same IPU hardware can improve utilisation and flexibility, both in the cloud and on-premises.

“The IPU is designed to scale,” according to Graphcore. “Models are getting larger and demand for AI compute is scaling exponentially. High bandwidth IPU-links allow multiple IPUs to be clustered, supporting huge models.

“Legacy architectures struggle on non-aligned and sparse data accesses. The IPU has been designed to support complex data access efficiently and at much higher speeds, which will be critical to run gigantic, next-generation models efficiently.”
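For readers curious how developers actually target the IPU, the sketch below shows what a minimal training step might look like. It assumes Graphcore’s PopTorch wrapper for PyTorch (part of the broader Poplar SDK but not mentioned in this article) and access to an IPU or the software IPU Model emulator; the model, options and values are illustrative only, not drawn from the piece.

# Minimal sketch, assuming Graphcore's PopTorch wrapper for PyTorch and
# access to an IPU (or the software IPU Model emulator). Illustrative only.
import torch
import poptorch


class Classifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(784, 10)
        self.loss = torch.nn.CrossEntropyLoss()

    def forward(self, x, labels=None):
        out = self.fc(x)
        if labels is not None:
            # PopTorch expects the loss to be returned from forward()
            # so it can be computed on the IPU alongside the model.
            return out, self.loss(out, labels)
        return out


model = Classifier()
opts = poptorch.Options()  # device selection, replication etc.; defaults here

# Compile the model for the IPU and run one training step.
training_model = poptorch.trainingModel(
    model,
    options=opts,
    optimizer=torch.optim.SGD(model.parameters(), lr=0.01),
)

data = torch.randn(16, 784)
labels = torch.randint(0, 10, (16,))
output, loss = training_model(data, labels)
print(loss)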

While Graphcore claims to be the first company to have developed a processor designed specifically for AI, chipmakers such as Nvidia, Intel and AMD have also been investing heavily in this area in recent years to meet surging demand.

In a statement, Graphcore’s founder and CEO, Nigel Toon, said that demand for IPU products is increasing with both new and existing customers.

“The major investments that we have made during 2018 and 2019 will help us to meet this strong demand by extending the capabilities of our technology and ecosystem, and will support long-term revenue growth and returns for our investors,” he added.

Partnership with Microsoft

Toon recently told TechCrunch that 2019 was a “transformative year for Graphcore”, as the company moved from development to commercial operations.

“We were pleased to announce our close partnership with Microsoft in November 2019, jointly announcing IPU availability for external customers on the Azure Cloud, as well as for use by Microsoft internal AI initiatives,” he added.

At the time, Girish Bablani, corporate vice-president of Azure Compute at Microsoft, said: “Natural language processing models are hugely important to Microsoft – to run our internal AI workloads and for our AI customers on Microsoft Azure.

“We are extremely excited by the potential that this new collaboration on processors with Graphcore will deliver for our customers. The Graphcore offering extends Azure’s capabilities and our efforts here form part of our strategy to ensure that Azure remains the best cloud for AI.”

Kelly Earley was a journalist with Silicon Republic

editorial@siliconrepublic.com