On Tuesday, the Catholic Substack newsletter The Pillar published an investigation into Monsignor Jeffrey Burrill, who had, up until that day, been the top administrator at the U.S. Conference of Catholic Bishops. Burrill resigned, The Pillar said, in anticipation of its report, which alleged he had regularly used the LGBTQ dating app Grindr and visited gay bars from 2018 to 2020.
Their source? “Commercially available app signal data.”
Catholic and LGBTQ advocates alike condemned The Pillar’s report as homophobic in its insinuation that Burrill’s alleged use of an LGBTQ dating app somehow proved he “engaged in serial sexual misconduct.” Others argued Burrill’s alleged behavior was hypocritical, as Catholic doctrine considers same-sex relationships a sin. Burrill himself was not immediately available for comment and has not made a public statement.
Regardless, many online commentators raised the same question: Wait, just how exactly did The Pillar get this information?
The article cites “commercially available app signal data” from “a mobile device correlated to Burrill” that was “obtained and analyzed by The Pillar.” It says the data “conveys mobile app data signals during two 26-week periods, the first in 2018 and the second in 2019 and 2020,” and says the information was “obtained from a data vendor and authenticated by an independent data consulting firm contracted by The Pillar.”
Privacy experts tell TIME the controversial report highlights the sorry state of the current data privacy landscape.
“It’s an excellent example of the lack of data protection in America,” says Jennifer King, a privacy and data policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. “It shows just how low the threshold is if you want to actually target an individual.”
How third party vendors get your data
It’s still unclear exactly how The Pillar obtained Burrill’s phone data, and Grindr denies that it came from the app.
“We do not believe Grindr is the source of the data behind the blog’s unethical, homophobic witch hunt. We have looked closely at this story, and the pieces simply do not add up,” a Grindr spokesperson said in a statement to TIME. “Grindr has policies and systems in place to protect personal data, and our users should continue to feel confident and proud in using Grindr regardless of their religion, ethnicity, sexual orientation, or gender identity.”
Grindr did not respond to follow-up questions asking for details on how it had investigated the issue internally, but in a statement received after the initial publication of this article, said that it “has not and does not sell anonymized user data to data brokers.”
Whatever the data’s source, Andrés Arrieta, director of consumer privacy engineering at the data privacy non-profit the Electronic Frontier Foundation, tells TIME the practice of sharing data with third party vendors is extremely common among mobile apps.
“There’s an industry whose full existence is to gather as much data about everyone, and then to sell it to anyone that will buy it,” Arrieta says.
Many apps, especially free ones, sell aggregated data—which can include demographics or location information—about their users to third party vendors as an extra source of revenue; these vendors then turn around and sell that data to advertisers looking for information on particular types of users, explains King. The data is transferred under the expectation that user identities will be made anonymous.
Someone could feasibly approach one of these third party vendors, King says, and pay for a package of location data, which might include when a user logged in and out, their approximate locations, and their phone’s static ID number (a unique string of numbers assigned to each mobile device). These packages can feature users of specific apps, like dating apps, explains Ben Zhao, a professor of computer science at the University of Chicago.
The issue, King explains, is that if you wanted to find the static ID number of a particular individual’s phone, and knew identifying factors like where they lived, worked, and traveled, you could parse through all of the location data to figure out which static ID number belongs to that person.
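The correlation technique King describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not anyone’s actual tooling: assume a purchased dataset of (device ID, latitude, longitude) rows and a short list of places the target is known to frequent, such as a home and a workplace. All names and coordinates below are invented.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def candidate_devices(records, known_places, radius_km=0.2):
    """Return the static device IDs observed near *every* known place.

    records: iterable of (device_id, lat, lon) rows from a purchased
    location-data package. known_places: (lat, lon) pairs the target
    is known to frequent. Both are hypothetical inputs.
    """
    hits = {}  # device_id -> set of indices of matched places
    for device_id, lat, lon in records:
        for i, (plat, plon) in enumerate(known_places):
            if haversine_km(lat, lon, plat, plon) <= radius_km:
                hits.setdefault(device_id, set()).add(i)
    # A device is a candidate only if it was seen at all known places.
    return [d for d, seen in hits.items() if len(seen) == len(known_places)]
```

With enough distinct, specific places, the candidate list typically collapses to a single device ID, which is why “anonymized” data with a persistent identifier offers little real protection.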
It appears The Pillar did just this. In its report, The Pillar said it “correlated a unique mobile device to Burrill when it was used consistently from 2018 until at least 2020 from the USCCB staff residence and headquarters, from meetings at which Burrill was in attendance, and was also used on numerous occasions at Burrill’s family lake house, near the residences of Burrill’s family members, and at a Wisconsin apartment in Burrill’s hometown, at which Burrill himself has been listed as a resident.”
The Pillar did not respond to TIME’s question as to whether someone tipped them off about Burrill having an account on Grindr.
This tactic isn’t unprecedented, King says: debt collectors in the repossession industry have used similar methods to track people’s movements.
“In essence, the privacy protection that you get from anonymizing things before you aggregate them and package them to be sold, is really a facade,” says Zhao. “Oftentimes companies think that they’re doing the right thing by anonymizing data, but what they’re doing actually falls short of what is really necessary to completely protect users from privacy attacks.”
“People in academia and in some industry circles have understood this for a long time,” he continues. “But I think there’s a general lack of understanding of this for the public, and that perhaps is why this particular story is so shocking to many people.”
The Conference of Catholic Bishops directed TIME to a Tuesday statement announcing Burrill had stepped down after it became aware of coming reports alleging “possible improper” behavior. “In order to avoid becoming a distraction to the operations and ongoing work of the Conference, Monsignor Burrill has resigned, effective immediately,” the statement said.
A lack of protection for users
Data privacy advocates have pointed to The Pillar’s report as the latest example of why the United States should impose stricter regulations on the buying and selling of personal user data.
“Experts have warned for years that data collected by advertising companies from Americans’ phones could be used to track them and reveal the most personal details of their lives. Unfortunately, they were right,” said Democratic Sen. Ron Wyden in a statement on The Pillar report shared with TIME. “Data brokers and advertising companies have lied to the public, assuring them that the information they collected was anonymous. As this awful episode demonstrates, those claims were bogus – individuals can be tracked and identified.”
In 2020, Wyden and Republican Sen. Bill Cassidy sent a letter signed by 10 other Senators asking the Federal Trade Commission (FTC) to investigate the online ad economy and the ways personal data, including location information, is sold by brokers. An FTC spokesperson confirmed to TIME that the agency received Wyden’s letter but did not have any further comment. (FTC investigations are nonpublic.)
Congress has also failed to pass any comprehensive data privacy legislation, and only a handful of states have enacted laws tackling the issue on their own. California became the first to do so in 2018 with its Consumer Privacy Act, which gives users the right to ask companies to delete their data and to opt out of its sale, but doesn’t actually stop third party services from trading in that data, King explains.
Arrieta argues regulation should require users to opt in to their data being collected and sold, rather than opt out. Regulation will also need an enforcement mechanism, he argues, and users must be able to see what data is being collected on them, who it’s being shared with, and have the option to delete it.
The European Union’s model for privacy protections is the strongest in the world, and its General Data Protection Regulation law, implemented in 2018, has cracked down on data collection in the ad tech industry. Even so, Arrieta explains, The Pillar’s investigation could have happened in any country.
Legislation won’t be a complete fix for the U.S. though, Zhao argues. It will also take a higher level of awareness among consumers, he says, and leadership from tech companies to strengthen their privacy policies.
Arrieta says he has hope that greater privacy protections are on the way—but cautions it’ll be an uphill battle. “There’s billions of dollars in this industry,” he says. “It’s gonna be a big fight.”