There have probably been times when you wondered what technology companies can do with your data. Maybe it was one of the moments you realized how surprisingly well-targeted Facebook’s ads are. Or maybe it was the time an advertisement ran on your Gmail page asking whether you and your boyfriend had just broken up. (Extra points for Gmail if you really did.)
Facebook aroused the ire of privacy activists across the country and in Europe after news broke last week that researchers had manipulated users’ News Feeds as part of a so-called “emotional contagion” study. Working with two outside researchers, Facebook tweaked the number of posts classified as “happy” or “sad” in nearly 700,000 users’ News Feeds to see how those users’ emotions would change. It did so without their consent.
Facebook didn’t break any laws in the United States, experts say. But reports of the study infuriated many users who felt their privacy had been violated and their emotions deliberately manipulated without their knowledge.
Many worried that Facebook’s research isn’t subject to the same level of oversight as research conducted at universities and other academic institutions.
What kind of research does Facebook conduct?
Facebook experiments run the gamut from simple A/B testing (marketing jargon for learning what consumers prefer by presenting them with two scenarios) to more elaborate research published with outside help, like the emotional contagion study. Data teams use advertisement targeting tests to find out what ads users prefer to click on, and reshuffle users’ News Feeds to see which format users engage with best. Facebook does that kind of research a lot.
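For a concrete sense of what an A/B test involves, here is a minimal, hypothetical sketch, not Facebook’s actual code: users are split into two groups by hashing their ID, each group is shown a different version of something, and the click-through rates of the two groups are compared.

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Stable 50/50 split: hashing the ID means a user always sees the same variant."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

def click_through_rate(clicks: int, impressions: int) -> float:
    return clicks / impressions if impressions else 0.0

# Illustrative tallies: variant A is the current ad format, B the candidate.
results = {
    "A": {"clicks": 412, "impressions": 10_000},
    "B": {"clicks": 463, "impressions": 10_000},
}
for variant, r in results.items():
    print(variant, f"{click_through_rate(r['clicks'], r['impressions']):.2%}")
```

Whichever variant draws more clicks wins; at Facebook’s scale, even tiny differences between the two groups become measurable.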
Does Facebook have any restrictions on the kinds of research it can do?
Facebook has few limits on the kinds of research it can conduct with your data, as long as that research is internal to the company. Here’s the crucial clause in Facebook’s data privacy policy:
In other words, once people have agreed to its terms and conditions, Facebook can conduct whatever data research it wants.
“When someone signs up for Facebook, we’ve always asked permission to use their information to provide and enhance the services we offer,” a Facebook spokesperson told TIME. “To suggest we conducted any corporate research without permission is complete fiction.”
Facebook also controls your News Feed, meaning it can change it essentially however it wants, including by rearranging the statuses on your page, as it did in the controversial study. It says as much here, in its terms of service:
So once you agree to the terms of service, there’s not much stopping Facebook from doing what it wants with your information and changing the way you interact with the platform. And it does, all the time.
“There aren’t a whole lot of legal constraints on the things that Facebook and other online service providers can do in terms of dealing with data they have about their own users,” Jon Penney, a lawyer and research fellow in internet policy at Harvard University, told TIME. “In terms of U.S. law, if Facebook is abiding by its terms of service, they’re going to be fine.”
How does Facebook monitor the ethics of its own research?
Most research institutions have highly formal internal review processes that involve panels called Institutional Review Boards. But Facebook doesn’t have the same internal review process that universities and other research institutions do.
Until recently, Facebook’s review process was largely informal and ad hoc.
“While I was at Facebook, there was no institutional review board that scrutinized the decision to run an experiment for internal purposes,” Andy Ledvina—a data scientist at Facebook until March 2013 and a software engineer until April 2014—told TIME. Most research (A/B testing, advertisement targeting, News Feed tweaking) did not require formal review.
But, Ledvina said, any research headed for a journal was run past public relations and Facebook’s legal department to determine what could be published. “People aren’t just running experiments willy-nilly,” Ledvina said. “Those who do run such experiments have very high ethical standards and experience running experiments both inside and outside academia.”
Facebook has since tightened its standards, formalizing a review process over the course of 2013. By the start of 2014, research intended for publication had to be reviewed by privacy and legal experts, and more prosaic research, like A/B testing and ad targeting, is now reviewed for privacy implications as well.
Facebook views customer research as a normal company practice that improves users’ time on the social media site. After all, Facebook is far from the only company that uses data to target customers. Grocery stores keep tabs on what sells and what doesn’t; advertisers conduct in-depth market surveys; and retailers target demographics and customer profiles.
“Our research is designed to understand how people use Facebook and how we can make our services better for them,” Facebook’s spokesperson said.
But what worries privacy activists about Facebook is the sheer intimacy of users’ content and the vast quantity of exposed data. “Facebook has unmatched social data based upon your seemingly natural interactions with your friends,” James Grimmelmann, a law professor at the University of Maryland, told TIME. “That’s the value proposition they’ve told their investors and the advertisers they work with that they’re trying to leverage.”
Okay, so we’ve been talking about Facebook’s own research. What about when it works with other scientists outside Facebook?
When data leaves the company, Facebook says, it’s anonymized and de-identified, meaning that there’s theoretically no way for outside researchers to track down individual Facebook users. Facebook believes it’s not violating anyone’s privacy when it collaborates with outside researchers.
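Facebook hasn’t published how that pipeline works. As a rough, hypothetical illustration of what de-identification can mean in practice, the sketch below drops direct identifiers and replaces the user ID with a salted one-way hash before a dataset is shared; every field and function name here is an assumption, not Facebook’s actual process.

```python
import hashlib
import secrets

SALT = secrets.token_hex(16)  # kept internal; never shared alongside the data

DIRECT_IDENTIFIERS = {"name", "email", "phone"}  # fields removed outright

def de_identify(record: dict) -> dict:
    """Drop direct identifiers and replace the user ID with a salted one-way hash."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["user_id"] = hashlib.sha256(
        (SALT + str(record["user_id"])).encode()
    ).hexdigest()
    return cleaned

record = {"user_id": 12345, "name": "Jane Doe", "email": "jane@example.com",
          "posts_per_week": 7, "avg_sentiment": 0.62}
print(de_identify(record))
```

Even so, hashing IDs is no guarantee of anonymity: combinations of the remaining fields can sometimes re-identify individuals, which is why the protection is only theoretical.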
But if Facebook collaborates with a university on a study of human behavior, the study would in most cases be subject to a formal review. And in studies like the one that involved manipulating users’ News Feeds to alter their emotions, the users would likely have to give consent—or the institution could risk losing federal funding.
Experts say current rules actually discourage Facebook from collaborating with outside researchers. Academic research is regulated by federal rules; Facebook’s internal research isn’t. That leaves Facebook freer to do what it wants when it keeps its research undisclosed, and encourages it to act in secret.
“The incentives that we’re giving them are completely backwards,” said Grimmelmann. “We’re telling them, hoard this data and go ahead and conduct all the experiments that you want and don’t tell anyone. Which is the very opposite of what we want for people using Facebook and for society.”