At the risk of being labeled a hopeless Luddite, I’ve been thinking a lot lately about whether technology, in particular Web 2.0 or 3.0 or whatever it is we’re on, is really all that. Several things have put this front of mind. First, there’s a spate of interesting fall books questioning conventional wisdom about the digital revolution. The Revenge of Analog by David Sax looks at how companies and individuals selling actual things, like indie bookstores, board games, even (gasp) print magazines, are experiencing a comeback. The prescient Nicholas Carr punches a hole in Silicon Valley’s cultural hubris with Utopia Is Creepy, and quant-turned-podcaster Cathy O’Neil tells us why big data won’t save us in Weapons of Math Destruction.
Beyond a mere surge of Silicon schadenfreude, there is a significant debate going on about the effects of technology: whether the digital revolution has made us better off socially, and by how much economically. The economist Robert Gordon, author of The Rise and Fall of American Growth, the Piketty-esque economic must-read of the year, is gaining traction in policy circles with a persuasive argument that the inventions that drove growth and productivity over the last 100 years or so weren’t the personal computer or the Internet, but the internal combustion engine, indoor plumbing and electricity.
Gordon’s research shows that the Industrial Revolution had a much bigger effect on economic growth than the PC, the iPhone, or any other gadget. Indeed, his book points out that productivity growth actually began to slow after the 1970s, which is when digital technology really began to take off. His conclusion: unless the techno-optimists come up with some truly seismic invention quickly, our children are likely to be worse off economically. He bases this on the fact that productivity is the key to stronger economic growth.
This debate has real-world, immediate implications. The Fed, for example, is thinking hard about why long-term productivity growth has been so weak for so long, as well as why productivity has become decoupled from wage increases. Indeed, the decision about whether the central bank will raise not only interest rates but perhaps even its own inflation target hinges on such questions.
Gordon argues that the big productivity payoff from digital technology has already come and gone: between 1996 and 2004, it was responsible for part, though not all, of the higher-than-average growth during Bill Clinton’s tenure. Meanwhile, rising inequality, inadequate fiscal stimulus, falling educational standards and flat birthrates create further headwinds to growth.
If he and other pessimists are right, then neither inflation nor growth is going to rise anytime soon (at least not on its own). That might argue for a higher inflation target, something that economists like Olivier Blanchard, Paul Krugman and Larry Summers have advocated, and for a longer period of loose monetary policy. But you can also make an argument that there’s not much monetary policy can do to fix the productivity problem anyway, and that the Fed shouldn’t risk more market bubbles with a longer period of easy money. Either way, this will undoubtedly be a big topic of debate at next week’s Jackson Hole conference.
Those who keep track of such things should bone up on their Gordon.