As AI systems proliferate, the demand for computing power to crunch large data sets has become monstrous. Cerebras, founded in 2015, has responded with the largest computer chip ever. Its Wafer-Scale Engine 3 measures roughly 8.5 in. on a side and packs 4 trillion transistors. Putting everything on one wafer rather than networking many chips reduces data-transfer times and energy use for the most compute-intensive AI jobs, says CEO Andrew Feldman. “We didn't repurpose a graphics-processing device. We said, ‘What would we do if this was the only problem, the full purpose of our existence?’” In Cerebras’ multimillion-dollar CS-3 supercomputer, the company’s chips have been put to work on jobs like building AI medical assistants for the Mayo Clinic. While planning an IPO, the company is building the third of nine $100 million supercomputers that will be interconnected by Emirati AI firm G42 to create “the world’s largest supercomputer for AI training.”