As AI systems proliferate, the demand for computing power to crunch large data sets has become monstrous. Cerebras, founded in 2015, has responded with the largest computer chip ever. Its Wafer-Scale Engine 3 measures roughly 8.5 in. on a side and packs 4 trillion transistors. Putting everything on one wafer rather than networking many chips reduces data-transfer times and energy use for the most compute-intensive AI jobs, says CEO Andrew Feldman. “We didn't repurpose a graphics-processing device. We said, ‘What would we do if this was the only problem, the full purpose of our existence?’” In Cerebras’ multimillion-dollar CS-3 supercomputer, the company’s chips have been put to work on jobs like building AI medical assistants for the Mayo Clinic. While planning an IPO, the company is building the third of nine $100 million supercomputers that will be interconnected by Emirati AI firm G42 to create “the world’s largest supercomputer for AI training.”