The Latest News in AI

We publish news articles on Forbes and copy them here for your convenience.

BrainChip Sees Gold In Sequential Data Analysis At The Edge

Unlike image processing or large language models, sequential data processing, which includes video processing and time-series analysis, has attracted few AI startups. BrainChip is just fine with that. With all the buzz around LLM generative AI, it is understandable...

NVIDIA GTC: “DPU” Smart NIC And More

NVIDIA Co-founder and CEO Jensen Huang rarely disappoints his audience or his investors. This week he once again delivered the goods at the GPU Technology Conference. Announcing a broad range of hardware and software innovations, Jensen made it clear that he intends...

IBM Research and NeuReality Announce Partnership For AI

NeuReality is the first licensee of IBM's reduced-precision core for AI. IBM (NYSE: IBM) and NeuReality, an Israeli AI systems and semiconductor company, have signed an agreement to develop the next generation of high-performance AI inference platforms. IBM and...

New Fabrics Enable Efficient AI Acceleration

While GPU performance has been the focus in data centers over the last few years, the performance of fabrics has become a key enabler or bottleneck in achieving the throughput and latency required to create and deliver artificial intelligence at scale. Nvidia’s...

How AI Changes The Role Of Memory Companies And Their Value

As GPUs become a bigger part of data center spend, the companies that provide the HBM memory needed to make them sing are benefitting tremendously. AI system performance is highly dependent on memory capacity, bandwidth, and latency. Consequently, memory technology is...

Synopsys Moves To RISC-V To Help SoC Developers

When the number two provider of CPU designs jumps on the RISC-V train, it is a significant milestone. The open-source RISC-V design is on a roll, displacing Arm in many SoC development plans. ARC and Arm are both companies that design and license microprocessor (CPU)...

The NeuReality: Fast Inference Processing For 90% Less?

Most of the investment buzz in AI hardware concentrates on the amazing accelerator chips that crunch the math required for neural networks, like Nvidia’s GPUs. But what about the rest of the story? CPUs and NICs that pre- and post-process the query add significant...

Esperanto Sees A Bright Future For RISC-V In AI And HPC

The company is shipping its first-gen chip globally, with over 1,000 cores at only 25 watts of power. Can it break into generative AI? Suddenly, AI has become the hottest investment and cocktail party topic du jour. But the estimates for power consumption are pretty...

NVIDIA Needed A CPU, But Did It Need To Buy Arm To Get One?

I often opine that NVIDIA needs a data center-class CPU to compete with Intel and AMD, both of which have used tightly coupled CPU/GPU technology to win the first three U.S. exascale supercomputer deals. Connecting massive GPUs to fast CPUs over a painfully slow PCIe...

Who Has The Fastest AI Inference, And Why Does It Matter?

A food fight erupted at the AI HW Summit earlier this year, where three companies all claimed to offer the fastest AI processing. All were faster than GPUs. Now Cerebras has claimed insanely fast AI performance with its latest software running on the company's...