
An illustration photo shows Cerebras logo in a smartphone. CFOTO/FUTURE PUBLISHING VIA GETTY
Cerebras, famous for being the only AI company shipping a full wafer-scale chip, has landed OpenAI as its first major US-based hyperscaler customer. Prior to this deal, Cerebras had been successful in securing investments and system commitments from a relatively small number of customers, most notably G42, the Abu Dhabi, UAE, AI company. Sam Altman, CEO of OpenAI, was an early investor in Cerebras. Cerebras has been touting its leadership in AI inference performance for the last couple of years, and that approach now appears to be paying off. (Like most AI semiconductor companies, Cerebras is a client of Cambrian-AI Research, LLC.)
OpenAI has partnered with Cerebras Systems to add 750MW of AI compute to OpenAI’s inference-as-a-service platform. The multi-year, multi-billion-dollar agreement brings Cerebras capacity and speed directly into OpenAI’s inference stack, rolling out in phases through 2028.
It's a Big Chunk
Let's do some math. The CS‑3's peak power is commonly cited at roughly 23 kW per system, and its price is often estimated at $2M–$3M per system. Conservatively assume that the Cerebras systems themselves draw 50% of the 750 MW, with the rest going to cooling, networking, and other overhead: 750 MW × 50% ÷ 23 kW ≈ 16,000 CS‑3 systems. At the low end of the price range, 16,000 × $2M ≈ $32B, roughly three times the $10B often estimated for this deal. Note that Cerebras is no doubt working on a CS-4, which is probably what the bulk of this deal is for and could consume less power per system.
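The arithmetic above can be sketched as follows. All inputs are estimates from public reporting (the ~23 kW CS-3 power figure, the $2M–$3M price range) and the 50% compute share is an assumption, not a confirmed figure:

```python
# Back-of-envelope sizing of the 750 MW Cerebras/OpenAI deal.
# Inputs are public estimates; the 50% compute share is an assumption.

TOTAL_POWER_MW = 750           # announced compute commitment
COMPUTE_SHARE = 0.50           # assumed fraction drawn by the systems themselves
CS3_POWER_KW = 23              # commonly cited CS-3 peak power per system
CS3_PRICE_USD = 2_000_000      # low end of the $2M-$3M per-system estimate

# Power available to CS-3 systems, in kW
compute_power_kw = TOTAL_POWER_MW * 1_000 * COMPUTE_SHARE

# System count, rounded to the nearest thousand for a round estimate
num_systems = round(compute_power_kw / CS3_POWER_KW, -3)   # ~16,000

# Implied hardware spend at the low-end price
total_cost_usd = num_systems * CS3_PRICE_USD               # ~$32B

print(f"~{num_systems:,.0f} CS-3 systems")
print(f"~${total_cost_usd / 1e9:.0f}B in hardware")
```

Changing `COMPUTE_SHARE` or using the $3M high-end price moves the total between roughly $32B and $97B, which is why the gap versus the commonly cited $10B deal size suggests much of the capacity will be newer, more efficient hardware.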

In a 2023 photograph, CEO Andrew Feldman stands atop crates of Cerebras systems. CEREBRAS
The timing couldn't be better for CEO and founder Andrew Feldman and his team at Cerebras. Cerebras Systems is in discussions to raise another $1B, which would value the AI chip maker at $22B, according to The Information. The company is also planning an IPO soon. No doubt the OpenAI deal could significantly increase that valuation.
Some Context
OpenAI has been in a world of hurt since the year began, starting with reports touting the superiority of Google's Gemini 3, followed by Apple choosing Google as its AI partner, including for the promised revamp of Apple's Siri assistant. Selecting the Cerebras platform, the fastest in the world, may give OpenAI a path to counter these competitive threats.
“OpenAI’s compute strategy is to build a resilient portfolio that matches the right systems to the right workloads,” OpenAI’s Sachin Katti said. “Cerebras adds a dedicated low-latency inference solution to our platform, and that means faster responses, more natural interactions, and a stronger foundation to scale real-time AI to many more people.” Sounds like a competitive advantage that OpenAI could use right about now.
Cerebras builds purpose-built AI systems designed for speed, combining massive compute, memory, and bandwidth on a single wafer-scale chip (the WSE-3) to eliminate the bottlenecks that slow inference on conventional GPU hardware. The result is dramatically faster responses, delivered at scale. The cores on the wafer communicate over an on-wafer network that spans the entire die, avoiding the bottlenecks and costs that server-to-server or GPU-to-GPU systems incur to effectively reconnect the chips that were cut apart from the original wafer.

The Wafer Scale Engine, unique to Cerebras. CEREBRAS SYSTEMS, INC.
According to Greg Brockman, OpenAI’s co-founder and president, the partnership will make ChatGPT not just the most capable, but the fastest AI platform in the world.
The collaboration builds on a long relationship. Cerebras has worked with OpenAI since 2017 and recently supported OpenAI's GPT-OSS-120B at 3,045 tokens per second, 15x faster than the leading GPU cloud.
As Andrew Feldman, Cerebras CEO and co-founder, puts it: “Just as broadband transformed the internet, real-time inference will transform AI — enabling entirely new ways to build and interact with models.”
Next Steps for Cerebras
Many of us analysts wondered why Nvidia acqui-hired Groq, and why Intel is in talks to purchase SambaNova, the other two large-scale data center AI startups. We covered these three in an article last year, and concluded that Cerebras had the most significant differentiation and promise.
The answer now is clear: Cerebras is prepared to take on Nvidia and everyone else as a standalone public company. Unless, that is, somebody else steps up with a lot of cash. And soon!
Disclosures: This article expresses the opinions of the author and is not to be taken as advice to purchase from or invest in the companies mentioned. My firm, Cambrian-AI Research, is fortunate to have many semiconductor firms as our clients, including Baya Systems, BrainChip, Cadence, Cerebras Systems, D-Matrix, Esperanto, Flex, Groq, IBM, Intel, Micron, NVIDIA, Qualcomm, Graphcore, SiMa.ai, Synopsys, Tenstorrent, Ventana Microsystems, and scores of investors. I have no investment positions in any of the companies mentioned in this article.