Andrew Feldman

CEO and Founder, Cerebras Systems


Computer chips are like cars, says Andrew Feldman, co-founder and CEO of the AI chip startup Cerebras Systems. “You make trade-offs that are optimized for what the job of the part is: If you want to move bricks and lumber, don't buy a minivan.” But over the past decade, graphics processing units (GPUs), designed for rendering graphics like those in video games, have become the industry standard for machine learning. Feldman and Cerebras are changing that by designing a chip specifically for AI.

“In 2015, we saw the rise of AI on the horizon,” Feldman says. “We asked ourselves, ‘Can we make something better for it?’”

Cerebras Systems spent $400 million over three years designing a new semiconductor specifically for AI workloads. The result is what the company calls a Wafer Scale Engine: a dinner-plate-sized chip, about 57 times the size of a GPU and the largest chip ever built. Its third-generation chips can train models in a fraction of the time GPUs require, helping win notable clients including Mayo Clinic and Emirati technology group G42. Cerebras Systems says the cloud platform it announced in August, built on its chips, can run Meta’s Llama models up to 20 times faster than a cluster of industry-standard Nvidia H100 GPUs, and twice as fast as competitor Groq’s offering.

Just as improvements in internet speed allowed for the sharing of images, and then the streaming of high-resolution video, advances in the speed of AI chips will open new possibilities, Feldman says. Still, the company is a long way from making a noticeable dent in Nvidia’s more-than-80% share of the AI chip market. “I’m a professional David in the battle of Goliath,” he adds. “Sometimes the best technology doesn't win. We have to try and be sure that it does.”

