As email inboxes around the world are flooded with updated privacy policy notifications, the European Union’s new privacy law, the General Data Protection Regulation (GDPR), takes effect on May 25.
The first significant update to E.U. data protection rules in more than 20 years, the GDPR requires companies to ask consumers for permission to collect their data, to answer promptly if asked what that data will be used for and to disclose significant data breaches within 72 hours. Failure to fully comply could result in fines of up to €20 million (more than $23 million) or 4% of the company’s worldwide annual revenue for the prior financial year. In other words, breaking the law could come with some serious consequences.
The seriousness of the penalties reflects a European approach to privacy that can be traced back, in large part, to German history — and to specific experiences with personal data being used for the most heinous purposes.
“There is this misperception that it’s a protectionist response, but the roots are much deeper. We trace them back to World War II and the atrocities of the Nazis, who systematically abused private data to identify Jews and other minority groups,” says Anu Bradford, professor of law and director of the European Legal Studies Center at Columbia Law School.
As the Nazi regime rose to power, state control of businesses brought with it state control of information technology.
In 1930s Germany, census workers went door to door filling out punch cards that indicated residents’ nationalities, native language, religion and profession. The cards were counted by the early data processors known as Hollerith machines, manufactured by IBM’s German subsidiary at the time, Deutsche Hollerith Maschinen GmbH (Dehomag). This history became more widely known after the publication of the 2001 book IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America’s Most Powerful Corporation, which argued that those Hollerith machines not only identified Jews, but also ran the trains that transported them to concentration camps. Some historians dispute the book’s claims that IBM supported the use of its machines to carry out genocide and argue that the Nazis also used other methods, as simple as pen and paper, to round up victims just as effectively; the company hasn’t denied that its machines were used during the Holocaust, but claims “most” documents about the operations have been “lost.”
But, regardless of the company’s direct involvement or lack thereof, the episode made clear that, while census data can help keep a government running, the collection of citizens’ personal information could also lead to direct harm for those people.
When the war ended, Germany was partitioned, but state surveillance remained intact, most famously carried out by the now-defunct East German secret police force known as the Stasi.
These officials were free to screen mail, search people’s apartments, bug bedrooms and bathrooms, and torture citizens of whom they were suspicious. They kept files on everything from people’s friends to their sexual habits. In response, the West German state of Hesse approved in 1970 what’s considered the country’s first modern data privacy law, covering public-sector data. This was followed by a 1977 Federal Data Protection Act designed to protect personal data “against abuse in their storage, transmission, modification and deletion.”
Later, concerns about unnecessarily intrusive census questions led to a landmark 1983 Federal Constitutional Court ruling that recognized “self-determination over personal data” as a fundamental right. “That became the cornerstone of the E.U.’s views today,” says Bradford.
All German citizens became entitled to those rights after the reunification of Germany in 1990. The end of the Cold War coincided with a rise in data transfers throughout Europe in the ’90s. The process of establishing a single market also included a 1995 E.U. data protection directive, and cautious attitudes about privacy became a European norm. Perhaps most famously, in 2014 Europe’s top court, the Court of Justice of the European Union, affirmed the so-called right to be forgotten and ruled that Google has to abide by user requests to take down “data that appear to be inadequate, irrelevant or no longer relevant.” Since then, Google has received 655,000 requests to remove about 2.5 million links and has complied with 43.3% of those requests.
Experts say the GDPR is essentially an upgrade of that 1995 law. And, Bradford says, that upgrade can be partially attributed to wider knowledge of how data has been misused, not just today but also in the past. “The understanding [of the Nazi history] is very widespread now,” she says. “Given the historical backdrop, that made the legislation intuitively more appealing and less subject to resistance.”
So what about privacy outside Europe, where the GDPR can’t be used to protect customers?
Bradford says Europe has been good about enacting privacy protections that tend to apply across all sectors of the economy, whereas U.S. laws apply to certain sectors (such as healthcare) more than others. But, especially after the Cambridge Analytica scandal over Facebook user data and the Equifax hack raised awareness of data privacy, Americans have expressed interest in the government taking more action. Polling by Reuters/Ipsos and HarrisX suggests that, while faith in the government’s ability to keep their personal data safe is another story, a growing number of Americans may be open to more federal regulation to protect their data.
Write to Olivia B. Waxman at olivia.waxman@time.com