Edward Snowden used widely available automated software to steal classified data from the National Security Agency’s networks, intelligence officials have determined, raising questions about the security of other top-secret military and intelligence systems under the NSA’s purview.
The New York Times, citing anonymous sources, reported that the former NSA contractor used a Web crawler, cheap software designed to index and back up websites, to scour the NSA’s data and return a trove of confidential documents. Snowden apparently programmed his search to find particular subjects and determine how deeply to follow links on the NSA’s internal networks.
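The mechanism the report describes — a crawler seeded with target subjects that follows internal links only to a set depth — can be sketched in a few lines. This is a minimal illustration over a hypothetical in-memory link graph, not the actual tool, which has never been identified; the page names and keywords are invented for the example.

```python
from collections import deque

def crawl(site, start, keywords, max_depth):
    """Breadth-first crawl of a link graph: collect pages whose text
    mentions any keyword, following links at most max_depth hops
    from the start page."""
    seen = {start}
    hits = []
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        text, links = site[page]
        if any(k in text.lower() for k in keywords):
            hits.append(page)
        if depth < max_depth:          # depth limit controls how far links are followed
            for link in links:
                if link not in seen:
                    seen.add(link)
                    queue.append((link, depth + 1))
    return hits

# Hypothetical "intranet": page -> (page text, outgoing links)
site = {
    "index":   ("internal portal",               ["reports", "misc"]),
    "reports": ("quarterly surveillance report", ["archive"]),
    "misc":    ("cafeteria menu",                []),
    "archive": ("older surveillance files",      []),
}

print(crawl(site, "index", ["surveillance"], max_depth=1))  # → ['reports']
print(crawl(site, "index", ["surveillance"], max_depth=2))  # → ['reports', 'archive']
```

Raising `max_depth` by one brings the deeper "archive" page into reach — the same lever the Times says Snowden used to decide how far his crawler would wander through the NSA's internal networks.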
Investigators found that Snowden’s method of obtaining the data was hardly sophisticated and should have been easily detected. Snowden accessed roughly 1.7 million files, intelligence officials said last week, partly because the NSA “compartmented” relatively little information, making it easier for a Web crawler like the one Snowden used to access a large number of files.