As AI systems proliferate, the demand for computing power to crunch large data sets has become monstrous. Cerebras, founded in 2015, has responded with the largest computer chip ever. Its Wafer-Scale Engine 3 measures roughly 8.5 in. per side and packs 4 trillion transistors. Putting everything on one wafer rather than networking many chips reduces data-transfer times and energy use for the most compute-intensive AI jobs, says CEO Andrew Feldman. “We didn't repurpose a graphics-processing device. We said, ‘What would we do if this was the only problem, the full purpose of our existence?’” In Cerebras’ multimillion-dollar CS-2 supercomputer, the company’s chips have been put to work on jobs like building AI medical assistants for the Mayo Clinic. While planning an IPO, the company is building the third of nine $100 million supercomputers that will be interconnected by Emirati AI firm G42 to build “the world’s largest supercomputer for AI training.”