[Photo: Director of National Intelligence James Clapper testifies on Capitol Hill in Washington, Wednesday, Jan. 29, 2014, before the Senate Intelligence Committee hearing on current and projected national security threats against the U.S. Pablo Martinez Monsivais—AP]

A new study calls the threat of catastrophic cyber-security failures overblown and says the government’s plan for fixes will ultimately make the Internet less secure.

“This is a really complex and dynamic system,” said the study’s lead author Eli Dourado, a tech policy research fellow at George Mason University’s Mercatus Center, a libertarian-leaning think tank. “What they’re trying to do is just beyond the capacity of humans to plan and control.”

The study lambasts the Commerce Department’s “Cybersecurity Framework,” which was released in February and recommends a series of voluntary measures to help “operators of critical infrastructure”—such as power plants, phone networks and financial services—develop better defenses against cyber-attacks. The framework, implemented by President Barack Obama through an executive order, seeks to impose a bit of order on the historically anarchic and ad hoc processes by which the Internet has been secured in the decades since it came into being.

“The framework would replace this creative process with one rigid incentive toward compliance with recommended federal standards,” the study says. In short, the authors argue it would shift the emphasis to complying with a federal standard rather than “the spontaneous, creative sources of experimentation and feedback that drive Internet innovation.”

The plan for a federally sponsored, public-private partnership to establish a national cyber-security protocol grew out of the fear that human society, ever more digitized and interconnected, sits tenuously on the precipice of disaster should the machines ever stop working. The fear of collapse has been stoked by the likes of former Defense Secretary Leon Panetta—who has warned of a possible “cyber-Pearl Harbor”—and intelligence chief James Clapper, who asserted that cyber-threats “cannot be overstated.”

“There’s not really any good evidence for cyber doom scenarios, the digital Pearl Harbor that everyone talks about,” Dourado said. “There’s people who benefit, of course, from the perception that there could be cyber doom scenarios, such as government contractors and people who have hitched their wagons to this idea that we need huge programs in order to stop these things from happening.”

Dourado concedes that classified evidence supporting “cyber doom” alarmism may exist. “If it exists it should be declassified,” he said. “This isn’t the Cold War. We can have an open conversation about this.”

Among the authors’ recommendations for improving web security in lieu of the Cybersecurity Framework—which they believe, not without reason, may someday become compulsory—is addressing this issue of over-classification.


The study notes that, according to one research group, 2013 was the worst year ever for data breaches, but not of the sort that might send a wall of water crashing through an incapacitated Hoover Dam. Rather, the U.S. has seen more smaller-scale security failures, like customer credit card data being stolen or the IRS losing track of employee records. The study recommends jump-starting the development of a market for cyber-threat insurance to mitigate the damage from those incidents, in addition to more narrowly defining what constitutes “critical infrastructure.”

“We’re not saying ‘Stop securing our resources,’” Dourado said. “We’re just saying we need to think about this as a system that can’t be controlled or planned by the government.”
