When University of Illinois Astrophysicist Larry Smarr discovered that the computers on campus were too small to simulate the behavior of quasars, those mysterious and distant starlike objects, he began casting around for a machine that could do the job. But Smarr soon realized that most of the hundred or so supercomputers powerful enough to serve his needs were either in the hands of private industry or tied up doing work for the Department of Defense. He finally had to use an American-made Cray 1 at West Germany’s Max Planck Institut. “The Germans were extremely gracious,” he says. “But it was somewhat ironic.”
Smarr may soon be able to spend more time in Illinois. Last week the National Science Foundation announced that he would be taking charge of one of the new supercomputer centers it is establishing at four U.S. universities. The selected schools–Princeton, Cornell, the University of California at San Diego (UCSD) and the University of Illinois at Urbana-Champaign–were immediately dubbed the “supercomputer U’s.” Their new machines, acquired through a $200 million NSF grant, will be the core of a network connecting some 30 other schools. The network should dramatically increase the high-speed computing power available to thousands of researchers.
For many scientists and engineers, the Government’s action came none too soon. Progress in every field of science, from molecular biology to particle physics, has become increasingly dependent on massive computing power. Scientists using supercomputers have been able to pry into nature in a way not possible before. By simulating everything from wind turbulence to gravitational fields, they have studied the mechanisms of thunderstorms, the optimum shape of an H-bomb, even the structure of the universe. Says Cornell Physicist Kenneth Wilson, the Nobel laureate who led the lobbying effort that resulted in last week’s announcement: “The stakes are enormous.”
Indeed, U.S. scientists in recent years have been losing their competitive edge to colleagues in Europe and Asia. The NSF program that provided schools with state-of-the-art computers throughout the 1950s and 1960s was halted in 1972. Meanwhile, the governments of France, Germany, England and Japan aggressively subsidize supercomputer purchases for their leading universities. Says NSF Program Director Larry Lee: “It will take the U.S. a good two years to catch up.”
Scientists are already queuing up with their pet projects. University of Pennsylvania Economist Lawrence Klein wants supercomputer time to build a comprehensive model of the world economy. At the University of Illinois, Meteorologist Robert Wilhelmson hopes to simulate the birth of a tornado. Hidenori Murakami, a structural engineer at UCSD, aims to predict the effects of earthquakes on skyscrapers, bridges and other structures. And at Cornell, researchers working under Wilson want to use their new machine to design a supercomputer a thousand times more powerful than the one they are about to receive.