Workers replace cables at a telecommunications hub in New York where companies exchange Internet traffic to boost efficiency. Peter Garritano

How America Risks Losing Its Innovation Edge

A version of this piece appears in a book published this month by the Aspen Strategy Group titled “Technology and National Security.”

For the past fifty years, the rational exuberance of the American economy has been propelled by the combination of three innovations: the computer, the microchip, and the internet.

The research and development that produced each came from a triangular alliance of government, academia, and private business. The first computers were funded by the military, built at the University of Pennsylvania and Harvard, then commercialized by companies such as Univac and IBM. Transistors were invented at Bell Labs, then federal funding for the space and strategic missile programs led private companies such as Fairchild and Intel to devise ways to etch thousands of them onto small silicon chips. And the internet was famously conceived by DARPA and built by research universities working with private contractors such as BBN.

This tripartite machine of government working with universities and private corporations was not merely a random array with each group pursuing its own aims. Instead, during and after World War II, the three groups had been purposely fused together into an innovation triangle.

The person most responsible for forging this assemblage was Vannevar Bush, an MIT professor who in 1931 had built an early analog computer. Bush was well-suited to this task because he was a star in all three camps: dean of the MIT School of Engineering, a founder of the electronics company Raytheon, and America’s top military science administrator during World War II.

Dr. Vannevar Bush, right, and E.S. Lamar demonstrate Dr. Robert Van de Graaff's 1,500,000-volt generator at the annual dinner of Technology alumni in the Copley Plaza. The generator is said to be capable of producing a shock of 10,000,000 volts. Bettmann Archive/Getty Images

He was passionate about elevating the role of science and engineering in society at a time—the mid-1930s—when not much exciting seemed to be happening in either field. The most notable new inventions put into the time capsule at the 1939 New York World's Fair were a Mickey Mouse watch and a Gillette safety razor. The advent of World War II would change that, producing an outpouring of new technologies with Bush leading the way.

Worried that America’s military was lagging in technology, he mobilized Harvard President James Bryant Conant and other scientific leaders to convince President Roosevelt to form the National Defense Research Committee and then the military’s Office of Scientific Research and Development, both of which he headed. With an ever-present pipe in his mouth and a pencil in his hand, he oversaw the Manhattan Project to build the atom bomb as well as projects to develop radar and air-defense systems.

When the war ended, Bush produced a report in July 1945 at the behest of President Roosevelt (which ended up being delivered to President Truman) that advocated government funding of basic research in partnership with universities and industry. Bush chose an evocative and quintessentially American title for his report: “Science, the Endless Frontier.” His introduction deserves to be reread whenever politicians threaten to defund the research needed for future innovation. “Basic research leads to new knowledge,” Bush wrote. “It provides scientific capital. It creates the fund from which the practical applications of knowledge must be drawn.”

The war, Bush wrote, had made it “clear beyond all doubt” that basic science—discovering the fundamentals of nuclear physics, lasers, semiconducting materials, computer science, radar—“is absolutely essential to national security.” It was also, he added, crucial for America’s economic security. “New products and new processes do not appear full-grown. They are founded on new principles and new conceptions, which in turn are painstakingly developed by research in the purest realms of science. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade.”

Bush’s description of how basic research provided the seed corn for practical inventions became known as the “linear model of innovation.” Based on this report, Congress established the National Science Foundation.

President Harry Truman, center, presents the Medal for Merit to Dr. Vannevar Bush, left, and Dr. James Bryant Conant, right, for their atomic research. AP/REX/Shutterstock

Most important, government spending was not funneled into government-run labs, as had happened with the Manhattan Project. Instead, government research funding went to universities and private contractors. “No American had greater influence in the growth of science and technology than Vannevar Bush,” MIT President Jerome Wiesner proclaimed, adding that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”

The creation of a triangular relationship between government, industry, and academia was, in its own way, one of the significant innovations that helped produce the technological revolution of the late 20th century. The Department of Defense soon became the prime funder of much of America’s basic research. By 1965, 23 percent of the federal government’s funding for university science came from the Pentagon—almost twice as much as from the National Science Foundation. The return on that investment was huge, leading not only to the internet, but to many of the pillars of America’s postwar innovation and economic boom.

A few corporate research centers, most notably Bell Labs, existed before the war. Bell Labs brought together theoreticians, materials scientists, metallurgists, engineers, and even telephone-pole climbers, and it showed how sustained innovation could occur when people with a variety of talents were brought together, preferably in close physical proximity where they could have frequent meetings and serendipitous encounters.

After Bush’s clarion call produced government contracts, other corporate research centers began to proliferate. Xerox created the Palo Alto Research Center, known as Xerox PARC, whose leaders included Bob Taylor, who had helped create the internet while running DARPA’s Information Processing Techniques Office. Xerox PARC developed the graphical user interface now used on personal computers, Ethernet, and dozens of other innovations that became part of the digital revolution.

In addition, hybrid labs combining government, academia, and industry were launched. Among the most notable were the RAND Corporation, originally formed to provide research and development (hence the name) to the Air Force, and the Stanford Research Institute (SRI).

Many of America’s most important new private corporations were spawned by the three-way relationship of Bush’s innovation triangle.

Take Google, for example. Larry Page’s father was a professor of computer science and artificial intelligence at Michigan State University and the recipient of large federal research grants. Sergey Brin’s parents were refugees from Russia who received visas to come to the US. His father became a math professor at the University of Maryland, where the Department of Defense funded research on calculating missile trajectories, and his mother became a researcher at the nearby NASA Goddard Space Flight Center.

Google co-founders Larry Page, left, and Sergey Brin, at their campus headquarters in Mountain View, Calif., 2003. Kim Kulish—Corbis/Getty Images

Both Larry and Sergey ended up at Stanford as graduate students in a government-funded program called the Digital Libraries Initiative. The money came from the National Science Foundation and a consortium of other federal agencies. With their tuition paid by this program, they came up with BackRub, a search engine whose PageRank algorithm ranked web pages by analyzing the links pointing to them. Thus was Google born.
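For readers curious about the mechanics, here is a minimal, illustrative sketch of the PageRank idea in Python. It is the textbook power-iteration formulation, not Google's production system, and the four-page link graph is invented purely for the example: a page earns a high score when other high-scoring pages link to it.

```python
# A toy, four-page "web": each page maps to the pages it links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

damping = 0.85  # standard damping factor from the original PageRank paper
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start all pages equal

for _ in range(50):  # power iteration: repeat until the scores settle
    rank = {
        p: (1 - damping) / len(pages)
           + damping * sum(rank[q] / len(links[q])   # each linker passes on
                           for q in pages            # an equal share of its
                           if p in links[q])         # own current score
        for p in pages
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

In this toy web, page C ends up ranked highest because A, B, and D all link to it, which is the essence of what made the approach so much better than keyword counting.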

Another great innovation spurring the American economy and competitiveness was likewise funded by the federal government through universities and corporate labs. Beginning with the presidency of George H.W. Bush, the Human Genome Project sequenced the human genome and launched a revolution in biomedicine that will produce the most important innovations and discoveries of the twenty-first century. “Through it all, the federal government invested heavily in basic research,” says Eric Lander, one of the leaders of the genome project. “That policy made American universities engines of discovery that attracted the best talent to our shores and sparked the world’s most innovative companies.”

The question now, Lander says, is “whether America will yield its position as the world’s leader in science and technology. For the first time since World War II, our primacy is in jeopardy.”

A 2017 report from the Atlantic Council echoed Vannevar Bush’s phrasing when it called such examples of federally funded basic research at university and corporate labs “the nation’s scientific seed corn, enabling basic, pre-competitive R&D that will mature into harvestable technologies in the future.” However, the report noted, “federal R&D spending has shrunk significantly over the last few decades; once the world leader, the United States now ranks twelfth in government-funded R&D spending as a percentage of GDP.” Federal R&D spending has declined from about 1.2 percent of GDP in 1976 to less than 0.8 percent in 2016. This is the lowest level since the pre-Sputnik era, and in the Aspen Strategy Group, there may still be a few people who know what the pre-Sputnik era was.

Some of this decrease in federal funding has been replaced by an increase in corporate research, especially in sectors such as the pharmaceutical industry, where it is clear that research can lead directly to valuable products. In the 1960s, around 70 percent of total R&D was federally funded, with 30 percent coming from the private sector. Now those figures are reversed.

Corporate funding tends to be more focused on products. As the balance has shifted away from government funding of university research labs, there has been a reduction in the basic scientific research aimed at creating fundamental theoretical knowledge, the seed corn from which great innovations eventually grow.

This decline in investment in basic research and university labs is not a partisan phenomenon or a product of the Trump administration. For almost twenty-five years, federal funding for university research and state funding for higher education have been in decline. Between 2011 and 2015, during the Obama administration, federal investment in university research declined by 13 percent.

But it’s now getting even worse. In the latest budgets proposed by House Republicans and the Trump administration, federal funding for science and technology research would be cut by an additional 15 percent.

In addition, despite the launch of some corporate labs such as Google X, private corporations have largely dismantled research institutes like Bell Labs and Xerox PARC, partly under pressure from short-term investors who demand quicker returns on investment.

The potential economic and security ramifications can be glimpsed by looking at the opposite approach now being taken by China, which is heavily funding basic scientific research, including in vital fields such as artificial intelligence (AI) and genetic engineering.

Baidu founder Robin Li walks after the opening ceremony of the first session of the 13th National Committee of the Chinese People's Political Consultative Conference (CPPCC) at the Great Hall of the People on March 3, 2018 in Beijing. VCG/Getty Images

Take the AI sector, for example. In its 13th Five-Year Plan, released in 2016, China’s leadership announced its ambition to transform China into a “nation of innovation” by launching fifteen “Science and Technology Innovation 2030 Megaprojects,” covering fields such as big data, intelligent manufacturing, and robotics. It was a steroid-charged version of Bush’s 1945 report urging America to combine federal dollars with university and corporate labs. A year later, in May 2017, China added “Artificial Intelligence 2.0” as the sixteenth megaproject.

The goal of this project is audacious yet simple: to make China the world leader in AI by 2030. Combining government dollars with corporate and academic initiatives, China is now building an ecosystem that would transcend even Bush’s wildest dreams.

The local government of Tianjin, a city two hours from Beijing, is raising a $5 billion fund to support AI development, and the central government is building a $2.1 billion AI technology park in Beijing's western suburbs.

Guided by the government’s vision, money is also flowing into the Chinese private sector. Venture funds and other private funds invested $4.5 billion in more than 200 Chinese AI companies between 2012 and 2017, according to Kai-Fu Lee, a former Google and Microsoft executive who now leads the venture capital firm Sinovation Ventures. The AI start-up SenseTime raised $600 million in a deal led by Alibaba, giving SenseTime an implied valuation of more than $3 billion. And by some measures, CB Insights reports, China has overtaken the US in the funding of AI start-ups: China accounted for 48 percent of the world’s AI start-up funding in 2017, compared to 38 percent for the US.

The funding and investments are already paying off. China’s students and programmers now routinely win international competitions in AI and machine learning. Baidu is at the forefront of AI, with 2,000 researchers, some of them in offices in Silicon Valley and Seattle. It now rivals Google as a global leader in AI research and boasts the most accurate and powerful program for speech recognition. According to the White House’s National Artificial Intelligence Research and Development Strategic Plan, China has surpassed the US in the number of journal articles mentioning “deep learning” or “deep neural network.” At the annual Association for the Advancement of Artificial Intelligence conference, the share of presented AI research papers with Chinese authors grew from 10 percent to 23 percent between 2012 and 2017, while the share with US authors declined from 41 percent to 34 percent.

China has one other advantage, though not one the US should envy: it has fewer restrictions on data collection and less compunction about violating personal privacy. This is unnerving, but it is also an edge, because big data will fuel many AI advances. China sits on a growing reservoir of it, making the country, as The Economist put it, “the Saudi Arabia of data.”

China’s version of Vannevar Bush’s innovation triangle is a “military-civil fusion” that encourages the collection of data on citizens. It uses facial-recognition technology for domestic surveillance. In Shenzhen, for example, there are cameras on poles with signs warning, “Jaywalkers will be captured using facial-recognition technology.” Cross the street improperly, and your face and name are likely to be displayed publicly on a nearby screen and put into a database.

People experience the face recognition payment system during Baidu Create 2018 at China National Convention Center in Beijing, July 4, 2018. Li Xin—Xinhua/Redux

Enter a search query into Google, and the company may gather the data to improve its algorithm and market products to you; make the same search on Baidu, and your data also goes into a government-controlled database. The same data collection policies apply every time someone in China uses a WeChat wallet, shops online on Taobao, or hails a ride with Didi. Baidu uses facial recognition of its employees to open the security gates in its lobby, and the technology lets customers at Kentucky Fried Chicken authorize payments with a facial scan. It is also used to recognize passengers at airport security gates. When US Customs and Border Protection floated a plan last year to do the same to verify the identity of people boarding certain flights in the US, a controversy erupted.

As these examples show, there are elements of China’s technology and innovation initiatives that the US will not wish to emulate. That is true of facial recognition, AI, and big data, and it is also true of gene editing, cloning, and other types of biotechnology, where China has fewer ethical and policy restrictions.

But the political and ethical restrictions in the US make it even more important for the US to stay ahead of China in other ways, most notably by funding basic scientific research and investing in university and corporate labs.

A good place to start would be revitalizing our investments in research universities, now being decimated by cuts and other challenges. The US has thirty-two of the top fifty universities in the world, magnets for the world’s best students. “But America seems increasingly unwelcoming to foreign students, whose applications this year have fallen by as much as 30 percent in some programs,” says Lander, who was a leader of the President’s Council of Advisors on Science and Technology under President Obama. “Will the next generation of entrepreneurs and leaders from around the world study elsewhere? At the same time, the Trump administration has proposed slashing funding for basic research. Congress came within a hair’s breadth of taxing graduate student fellowships and did impose a tax on university endowments, which help fund costs not covered by tuition.”

Reversing such policies is the critical first step to creating, once again, the research breakthroughs that will lead to future innovations, rather than continuing on America’s new path of destroying our seed corn before the next harvest.

Isaacson is a professor of history at Tulane University and a best-selling biographer. His most recent book is Leonardo da Vinci.

