
Meta founder and CEO Mark Zuckerberg was joined by Databricks co-founder and CEO Ali Ghodsi. (Image credit: YouTube)
This is a big deal for Meta and the AI industry: the maker of the popular open-source Llama LLM is now seeking to directly monetize the model's remarkable adoption. Developers simply access the model from the cloud, with no hardware or software to install.
But it is also a big deal for Cerebras and Groq, the two startups Meta selected to serve fast tokens, many times faster than a GPU can. (Nvidia, Cerebras and Groq are all clients of Cambrian-AI Research.) Meta did not disclose pricing; the API is currently in preview, and access to Groq and Cerebras is available only by request. This is the first time either startup has landed a foothold at a hyperscale Cloud Service Provider (CSP). And Meta has made it remarkably easy to use: developers simply select Groq or Cerebras in the API call, as sketched below.
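Meta has not published the final API shape, so the following is only a minimal sketch of what such a call might look like, assuming an OpenAI-compatible chat-completions endpoint. The base URL, model identifier, and "provider" field here are hypothetical placeholders for illustration, not the documented interface.

```python
# Hypothetical sketch of routing a Llama API request to Cerebras or Groq.
# The endpoint URL, model name, and "provider" field are assumptions for
# illustration; consult Meta's preview documentation for the real interface.
import requests

LLAMA_API_URL = "https://api.llama.example/v1/chat/completions"  # placeholder URL

def fast_completion(prompt: str, provider: str = "cerebras") -> str:
    """Request a completion, selecting the fast-inference provider by name."""
    response = requests.post(
        LLAMA_API_URL,
        headers={"Authorization": "Bearer YOUR_API_KEY"},
        json={
            "model": "llama-4-maverick",   # placeholder model identifier
            "provider": provider,          # "cerebras" or "groq" (assumed field)
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

print(fast_completion("Summarize the Llama API announcement.", provider="groq"))
```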

Cerebras is by far the industry's fastest inference processor, at roughly 20 times GPU speeds, while Groq is some five-fold faster than any GPU. (Image credit: Cerebras)
“Cerebras is proud to make Llama API the fastest inference API in the world,” said Andrew Feldman, CEO and co-founder of Cerebras. “Developers building agentic and real-time apps need speed. With Cerebras on Llama API, they can build AI systems that are fundamentally out of reach for leading GPU-based inference clouds.”

Llama on Cerebras is far faster than on Google TPUs or Nvidia GPUs. (Image credit: Cerebras)
Andrew’s point is important. Inference at some 100 tokens per second is already faster than a human can read, so “one-shot” inference requests for a service like ChatGPT run just fine on GPUs. But multi-model agents and reasoning models can increase computational requirements roughly 100-fold, opening an opportunity for faster inference from companies like Cerebras and Groq. Meta did not mention the third fast-inference company, SambaNova, but indicated it is open to other compute options in the future.
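The arithmetic behind that point is worth spelling out. The numbers below are illustrative round figures, not benchmark results: they simply show how a workload that generates roughly 100 times more tokens turns a comfortable GPU-class rate into a long wait, while a ~20x-faster engine keeps the interaction responsive.

```python
# Illustrative latency arithmetic (assumed round-number rates; not benchmarks).
GPU_RATE = 100        # tokens/second, roughly human reading speed
FAST_RATE = 2_000     # tokens/second, a ~20x faster engine (Cerebras/Groq class)

one_shot_tokens = 500        # a single chat answer
reasoning_tokens = 50_000    # agent plus chain-of-thought, ~100x the tokens

for label, tokens in [("one-shot", one_shot_tokens), ("agentic", reasoning_tokens)]:
    print(f"{label}: GPU {tokens / GPU_RATE:.1f}s vs fast engine {tokens / FAST_RATE:.1f}s")
# one-shot: GPU 5.0s vs fast engine 0.2s   -> both feel nearly instant
# agentic:  GPU 500.0s vs fast engine 25.0s -> only the fast engine stays interactive
```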
It will be interesting to see how well these two new options fare in the tokens-as-a-service world.