Fifty years ago this week–shortly after lunch on Dec. 23, 1947–the Digital Revolution was born. It happened on a drizzly Tuesday in New Jersey, when two Bell Labs scientists demonstrated a tiny contraption they had concocted from some strips of gold foil, a chip of semiconducting material and a bent paper clip. As their colleagues watched with a mix of wonder and envy, they showed how their gizmo, which was dubbed a transistor, could take an electric current, amplify it and switch it on and off.
That Digital Revolution is now transforming the end of this century the way the Industrial Revolution transformed the end of the last one. Today, millions of transistors, each costing far less than a staple, can be etched on wafers of silicon. On these microchips, all the world’s information and entertainment can be stored in digital form, processed and zapped to every nook of a networked planet. And in 1997, as the U.S. completed nearly seven years of growth, the microchip has become the dynamo of a new economy marked by low unemployment, negligible inflation and a rationally exuberant stock market.
This has been a year of big stories. The death of Princess Diana tapped a wellspring of modern emotions and highlighted a change in the way we define news. The cloning of an adult sheep raised the specter of science outpacing our moral processing power and had a historic significance that will ripple through the next century. But the story that had the most impact on 1997 was the one that had the most impact throughout this decade: the growth of a new economy, global in scope but brought home in the glad tidings of personal portfolios, that has been propelled by the power of the microchip.
And so TIME chooses as its 1997 Man of the Year Andrew Steven Grove, chairman and CEO of Intel, the person most responsible for the amazing growth in the power and innovative potential of microchips. His character traits are emblematic of this amazing century: a paranoia bred from his having been a refugee from the Nazis and then the Communists; an entrepreneurial optimism instilled as an immigrant to a land brimming with freedom and opportunity; and a sharpness tinged with arrogance that comes from being a brilliant mind on the front line of a revolution.
Like his fellow wealth builders of the digital age, Grove makes his product his mission and shuns the philosophical mantle and higher callings often adopted by titans of an earlier era. Ask him to ruminate on issues like the role of technology in our society, and his pixie face contorts into a frozen smile with impatient eyes. “Technology happens,” he clips. “It’s not good, it’s not bad. Is steel good or bad?” The steel in his own character comes through at such moments. He has a courageous passion alloyed with an engineer’s analytic coldness, whether it be in battling his prostate cancer or in guiding Intel’s death-defying climb to dominate the market for the world’s most important product.
These traits have allowed Grove to push with paranoiac obsession the bounds of innovation and to build Intel, which makes nearly 90% of the planet’s PC microprocessors, into a company worth $115 billion (more than IBM), with $5.1 billion in annual profits (seventh most profitable in the world) and an annual return to investors of 44% during the past 10 years. Other great entrepreneurs, most notably the visionary wizard Bill Gates, have become richer and better known by creating the software that makes use of the microchip. But more than any other person, Andy Grove has made real the defining law of the digital age: the prediction by his friend and Intel co-founder Gordon Moore that microchips would double in power and halve in price every 18 months or so. And to that law Grove has added his own: we will continually find new things for microchips to do that were scarcely imaginable a year or two earlier.
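Moore’s prediction is, at bottom, simple compound arithmetic. As a rough sketch (the 18-month doubling period is the article’s; the function name and the sample horizons are illustrative assumptions), a few lines of Python show how quickly the compounding runs away:

```python
# A minimal sketch of Moore's law as stated above: chip power doubles,
# and price per unit of power halves, roughly every 18 months.

def moore_factor(years: float, doubling_months: float = 18.0) -> float:
    """Growth multiple after `years` under a fixed doubling period."""
    return 2.0 ** (years * 12.0 / doubling_months)

for years in (1.5, 3.0, 10.0):
    f = moore_factor(years)
    print(f"{years:>4} years -> {f:,.0f}x the power at 1/{f:,.0f} the price")
```

At that rate a decade yields roughly a hundredfold gain, which is the scale of change the rest of this story describes.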
The result is one of the great statistical zingers of our age: every month, 4 quadrillion transistors are produced, more than half a million for every human on the planet. Intel’s space-suited workers etch more than 7 million, in lines one four-hundredth the thickness of a human hair, on each of its thumbnail-size Pentium II chips, which sell for about $500 and can make 588 million calculations a second.
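The zinger survives a back-of-the-envelope check. In the sketch below, the world-population figure of roughly 5.8 billion is an assumption (the article supplies only the monthly transistor count); the Pentium II numbers are the article’s own:

```python
# Back-of-the-envelope check of the figures quoted above.
transistors_per_month = 4e15   # "4 quadrillion transistors" produced monthly
world_population = 5.8e9       # assumed mid-1997 figure, not from the article

per_person = transistors_per_month / world_population
print(f"~{per_person:,.0f} transistors per person per month")  # ~689,655

# Pentium II, per the article: ~$500 for 588 million calculations a second.
price, calcs_per_sec = 500, 588e6
print(f"~${price / (calcs_per_sec / 1e6):.2f} per million calculations/sec")
```

Both results square with the claim of “more than half a million for every human on the planet.”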
The dawn of a new millennium–which is the grandest measure we have of human time–permits us to think big about history. We can pause to notice what Grove calls, somewhat inelegantly, “strategic inflection points,” those moments when new circumstances alter the way the world works, as if the current of history goes through a transistor and our oscilloscopes blip. It can happen because of an invention (Gutenberg’s printing press in the 15th century), or an idea (individual liberty in the 18th century), or a technology (electricity in the 19th century) or a process (the assembly line early in this century).
The microchip has become–like the steam engine, electricity and the assembly line–an advance that propels a new economy. Its impact on growth and productivity numbers is still a matter of dispute, but not its impact on the way we work and live. This new economy has several features:
–It’s global. Money now respects no borders. With clicks of a keyboard, investors trade $1.5 trillion worth of foreign currencies and $15 trillion in stocks worldwide each day, putting errant or unlucky nations at the mercy of merciless speculators.
–It’s networked. Handbags from Italy and designer shoes from Hong Kong are available to Web surfers throughout cyberspace; clerical work or software programming can be outsourced from anywhere to workers in Omaha or Bangalore; and the illness of a child in Bali can be diagnosed by a doctor in Bangor.
–It’s based on information. In today’s knowledge-based economy, intellectual capital drives the value of products. The shift shows up in the job mix: from 1990 to 1996 the number of people making goods fell 1%, while the number employed in providing services grew 15%.
–It decentralizes power. As the transistor was being invented, George Orwell, in his book 1984, was making one of the worst predictions in a century filled with them: that technology would be a centralizing, totalitarian influence. Instead, technology became a force for democracy and individual empowerment. The Internet allows anyone to be a publisher or pundit, E-mail subverts rigid hierarchies, and the tumult of digital innovation rewards wildcats who risk battle with monolithic phone companies. The symbol of the atomic age, which tended to centralize power, was a nucleus with electrons held in tight orbit; the symbol of the digital age is the Web, with countless centers of power all equally networked.
–It rewards openness. Information can no longer be easily controlled nor ideas repressed nor societies kept closed. A networked world facilitates free minds, free markets and free trade.
–It’s specialized. The old economy was geared to mass production, mass marketing and mass media: cookie-cutter products spewed from assembly lines in central factories; entertainment and ideas were broadcast from big studios and publishers. Now products can be individualized. Need steel tailored to your specifications? Some high-tech mini-mill will provide it. Prefer opinions different from those on this page? A thousand Webzines and personalized news products are waiting to connect with you.
No one believes the microchip has repealed the business cycle or deleted the threat of inflation. But it has, at the very least, ended the sway of decline theorists and the “limits to growth” crowd, ranging from the Club of Rome Cassandras to more recent doomsayers convinced that America’s influence was destined to wane.
The U.S. now enjoys what in many respects is the healthiest economy in its history, and probably that of any nation ever. More than 400,000 new jobs were created last month, bringing unemployment down to 4.6%, the lowest level in almost 25 years. Labor-force participation has also improved: the proportion of working-age people with jobs is the highest ever recorded. Wage stagnation seems to be ending: earnings have risen more than 4% in the past 12 months, the greatest inflation-adjusted gain in 20 years. The Dow is at 7756, having more than doubled in three years, and corporate profits are at their highest level ever. Yet inflation is a negligible 2%, and even the dour Fed Chairman Alan Greenspan seems confident enough in the new economy to keep interest rates low.
Driving all this is the microchip. The high-tech industry, which accounted for less than 10% of America’s growth in 1990, accounts for 30% today. Every week a Silicon Valley company goes public. It’s an industry that pays good wages and makes both skilled and unskilled workers more efficient. Its products cost less each year and help reduce the prices in other industries. That, along with the global competition that computers and networks facilitate, helps keep inflation down.
Economists point out that the Digital Revolution has not yet been reflected in productivity statistics. The annual growth of nonfarm productivity during the 1980s and 1990s has averaged about 1%, in contrast to almost 3% in the 1960s. But that may be changing. During the past year, productivity grew about 2.5%. And in the most recent quarter the rate was more than 4%.
In addition, the traditional statistics are increasingly likely to understate growth and productivity. The outputs of the old economy were simpler to measure: steel and cars and widgets are easily totted up. But the new economy defies compartmentalized measurement. Corporate software purchases, for instance, are not counted as economic investment. What is the value of cell phones that keep getting cheaper, or of E-mail? By traditional measures banking is contracting, yet there has been explosive growth in automated banking and credit-card transactions; the same for the way health care is delivered.
Even the cautious Greenspan has become a wary believer in the new economy. “I have in mind,” he told Congress earlier this year when not raising interest rates, “the increasingly successful and pervasive application of recent technological advances, especially in telecommunications and computers, to enhance efficiencies in the production process.” Translation: Inventories can now be managed more efficiently, and production capacity can more quickly respond to changes in demand. A fanatic for data, Greenspan has soaked up the evidence of surging corporate investment in technology and says managers presumably are doing so because they believe it will enhance productivity and profits. “The anecdotal evidence is ample,” he says.
Anecdotal? Economists are supposed to eschew that. Yet the most powerful evidence of the way the Digital Revolution has created a new economy comes from the testimony of those embracing it. A manager at a service company in Kansas talks about not having to raise prices because he’s reaping increased profits through technology. An executive of an engine company in Ohio tells of resolving an issue with colleagues on three continents in a one-day flurry of E-mail, a task that once would have taken weeks of memos and missed phone calls. At a Chrysler plant in Missouri, a shop steward describes labor-saving technology that his union members embraced because they see how their factory, which had been shut down in the late ’80s, is now expanding. And the greatest collection of anecdotal insight, the stock market, has spent the year betting on ever increasing profits.
Of course the microchip, like every new technology, brings viruses. Increased reliance on technology has led to the threat of growing inequality and a two-tier society. Workers and students not properly trained will be left behind, opening the way for the social disruptions that accompanied the shift to the industrial age. At a time when they are most needed, schools have been allowed to deteriorate, and worker-training programs have fallen prey to budget austerity. For all the spending on computers and software ($800 billion in the U.S. during the past five years), the most obvious investment has not been made: ensuring that every schoolchild has a personal computer. Grove himself says this would be the most effective way to reboot education in America, yet he and others in the industry have been timid in enlisting in such a crusade.
In addition, though wage stagnation seems to be easing, workers’ insecurity remains high. The layoffs that have accompanied technological change have been burned into their minds like code on a ROM chip. The weakening of labor bargaining power, inherent in a global economy where jobs and investment can be shifted freely, has led to what William Greider in the Nation calls a “widening gap between an expanding production base worldwide and an inability of consumers to buy all the new output.”
There are also more personal concerns. Computer networks allow information to be accessed, accumulated and correlated in ways that threaten privacy as never before. Unseen eyes (of your boss, your neighbor, thousands of marketers) can track what you buy, the things you read and write, where you travel and whom you call. Your kids can download pornographic pictures and chat with strangers.
But these challenges can be surmounted. Technology can even provide the tools to do so, if people supply the will. As Andy Grove says, technology is not inherently good or evil. It is only a tool for reflecting our values.
If the Digital Revolution is accompanied by ways to ensure that everyone has the chance to participate, then it could spark an unprecedented millennial boom, global in scope but empowering to each individual, marked not only by economic growth but also by a spread of knowledge and freedom and true community. That’s a daunting task. But it shouldn’t be much harder than figuring out how to etch more than 7 million transistors on a sliver of silicon.