As AI systems proliferate, the demand for computing power to crunch large data sets has become monstrous. Cerebras, founded in 2015, has responded with the largest computer chip ever built. Its Wafer-Scale Engine 3 measures roughly 8.5 in. on each side and packs 4 trillion transistors. Putting everything on one wafer rather than networking many chips reduces data-transfer times and energy use for the most compute-intensive AI jobs, says CEO Andrew Feldman. “We didn't repurpose a graphics-processing device. We said, ‘What would we do if this was the only problem, the full purpose of our existence?’” In Cerebras’ multimillion-dollar CS-2 supercomputers, the company’s chips have been put to work on jobs like building AI medical assistants for the Mayo Clinic. While planning an IPO, the company is building the third of nine $100 million supercomputers that Emirati AI firm G42 will interconnect to create “the world’s largest supercomputer for AI training.”