OpenAI agrees $10bn AI infrastructure deal with start-up Cerebras – Financial Times
Cerebras Systems says its technology enables AI ‘inference’ at speeds dramatically faster than market-leading graphics processing units © via Reuters

OpenAI has struck a $10bn multiyear agreement with Cerebras Systems to source computing infrastructure from the chip start-up, which is trying to rival the dominant chips group Nvidia.

The deal, which will run until 2028, involves 750 megawatts of computing capacity, the companies announced on Wednesday, enough electricity to power a major US city.

It marks the ChatGPT maker’s latest move to secure massive amounts of computing hardware to run its AI models and to diversify its supply of chips by cutting deals with smaller players.

Cerebras, valued at $8.1bn late last year, offers a cloud platform underpinned by specialised chips the size of a dinner plate.

The start-up says its technology enables AI “inference” — the process by which models respond to queries — at speeds dramatically faster than market-leading graphics processing units supplied by AI chip titans such as Nvidia and AMD.

France’s Mistral uses Cerebras’ chips for its chatbot, Le Chat.

In recent months, OpenAI has moved to diversify its AI hardware suppliers, inking a major deal with AMD last year and preparing to launch its own AI chips designed with Broadcom this year.

OpenAI has made commitments totalling about $1.5tn over the next decade or so to partners providing the infrastructure to train and run its AI models.

Sam Altman, the start-up’s chief executive, has argued access to computing power, cutting-edge chips and other components will provide his company with an advantage over rivals including Google and Meta.

But the commitments dwarf OpenAI’s revenues, currently about $20bn on an annualised basis, said people familiar with the matter, and the ChatGPT maker remains lossmaking.

Inference is seen as a crucial driver of AI computing demand in the coming years. Nvidia in December scooped up talent from competitor Groq, which also focuses on chips that offer faster response times.

“OpenAI’s compute strategy is to build a resilient portfolio that matches the right systems to the right workloads,” said Sachin Katti, OpenAI’s head of infrastructure.

Cerebras’ technology “means faster responses” and “more natural interactions” from AI models, said Katti, who left Intel to join OpenAI in November.

Cerebras raised $1.1bn in September from investors including Fidelity and Donald Trump Jr’s firm 1789 Capital, as it prepares for an initial public offering.

OpenAI has raised about $60bn to fund its plans and struck novel and sometimes circular arrangements with chip partners including AMD and Nvidia.

OpenAI is also in discussions with investors about a new funding round which could raise as much as $80bn and value it at more than $800bn, said multiple people with knowledge of the talks.
