The Latest News in AI

We publish news articles on Forbes, which are copied here for your convenience.  

My 2026 AI Predictions Have A Few Surprises

OK, I haven’t done this in a while; no excuse other than laziness. But here are ten concrete, defensible predictions for AI in 2026, with a bias toward things that materially matter for infra, enterprises, and policy. 1. Agentic AI moves from demos to staffed “digital...


AI Training: “I’m Not Dead Yet!”

With so much focus on inference processing, it is easy to overlook the AI training market, which continues to drive gigawatts of AI computing capacity. The latest benchmarks show that the training of AI models, an immense investment in power and compute, continues to...


AI Hardware: Harder Than It Looks

The second AI HW Summit took place in the heart of Silicon Valley on September 17-18, with nearly fifty speakers presenting to over 500 attendees (almost twice the size of last year’s inaugural audience). While I cannot possibly cover all the interesting companies on...

NVIDIA Outperforms Itself Once Again

When the only company you can beat is yourself, what do you do? After all, you are the leader, right? And nobody else is even suited up and on the court. Some companies might rest on their laurels, saving money and slowing down R&D. NVIDIA is not one of those...

AMD Launches New GPU And EPYC CPU Right Across NVIDIA’s Bow

The Instinct MI200 is nearly five times faster than the NVIDIA A100 for HPC, but is theoretically only 20% faster for AI. One year ago I complained that the newly announced AMD MI100 GPU was great for HPC, but inadequate for most AI workloads. Now AMD has announced...

The Graphcore Data Center Architecture

The Graphcore disaggregated accelerator could be a game changer. I have recently finished a research paper looking into the data center architecture for deploying the Graphcore IPU-Machine, a network-attached accelerator for highly parallel workloads. Let's...

Cadence AI Can Increase Chip Design Quality And Chip Designer Productivity Over 1000%

The EDA company’s clients have finished hundreds of new tapeouts using Cadence Cerebrus AI to speed development and make chips that run faster, use less energy, and cost less. For Cadence, AI is all about increasing engineering teams’ productivity, speeding higher...

Big AI Inference Has Become A Big Deal And A Bigger Business

Thanks to innovations like DeepSeek, training AI has become cheaper. However, inference is becoming more demanding as we ask AI to think harder before answering our questions. NVIDIA, Groq, and Cerebras Systems (clients of Cambrian-AI Research) have all released...

Don’t Trust AI? NVIDIA Guardrails May Lower Your Anxiety, And Save Your Job.

A new open-source NeMo toolkit allows engineers to easily build a front end to any large language model to control topic range, safety, and security. We’ve all read about or experienced the major issue of the day when it comes to large language models (LLMs). “My LLM...

Synopsys Lays Out The Benefits Of AI

At the annual Synopsys User Group meeting, CEO Aart de Geus explains the industry dynamics in an engineering world transformed by AI. I won’t start this blog by talking about ChatGPT. You’re tired of hearing about it, right? But the same technology (reinforcement...

Mythic Launches Industry First Analog AI Chip

Mythic claims 400 TOPS with 1.28B parameters on a 16-chip PCIe card. Please welcome new Cambrian-AI analyst Gary Fritz, who contributed to this article. Artificial intelligence applications are starting to show up in everything from cell phones to supertankers. But at...

The Cambrian AI Landscape: NVIDIA, The 800Lb Gorilla

NVIDIA is the 800 lb. gorilla in the AI hardware space. The company invests heavily in silicon, systems, and software to maintain that leadership position. This blog is an excerpt from the AI Competitive Landscape Report which is available to subscribers. Early...