Calm Down: Facebook Isn’t Manipulating Your Emotions

Have you heard that you might have been Facebook’s guinea pig? That the company, working with some scientists, fiddled around with 698,003 people’s News Feeds in January 2012 and tried to make the users feel sadder (or happier) by manipulating what members read?

Shocked? Violated? Creeped out? Well, be prepared to be even more shocked, violated and creeped out. Because what Facebook did was scientifically acceptable, ethically allowable and, let’s face it, probably among the more innocuous ways that you’re being manipulated in nearly every aspect of your life.

First things first. The researchers didn’t “make” users feel sadder or happier. What they did was make it more or less likely that users would see posts containing slightly more negative or slightly more positive language. Overall, those who had emotionally charged messages hidden from their News Feed used fewer words when posting, and those who did see emotional words tended to reflect the tone of their feeds when they posted. But there’s a difference between using one more negative word per 1,000 words over a week of posts, as the study found, and what psychologists would call feeling sad or depressed.
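For readers who want a concrete sense of what a “words per 1,000” measure looks like, here is a minimal illustrative sketch. The word lists and the emotion_rates function below are invented for this example; the researchers used an automated text-analysis tool, not this code.

```python
# Illustrative only: a toy version of an "emotion words per 1,000 words"
# metric, the kind of rate the study reported. The word lists are
# placeholders, not the lexicon the researchers actually used.

POSITIVE = {"happy", "great", "love", "wonderful", "glad"}
NEGATIVE = {"sad", "angry", "hate", "terrible", "upset"}

def emotion_rates(posts):
    """Return (positive, negative) emotion words per 1,000 words of posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    total = len(words) or 1          # guard against an empty set of posts
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 1000 * pos / total, 1000 * neg / total

# A week of posts from one hypothetical user:
print(emotion_rates(["Feeling great today!", "That movie was terrible."]))
# -> (142.86..., 142.86...): 1 positive and 1 negative word out of 7 total
```

At this scale, a shift of one word per 1,000 is a barely detectable statistical nudge, which is the article’s point: it is measurable, but it is not the same thing as a clinically meaningful change in mood.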

Adam Kramer of Facebook, one of the study’s co-authors, posted an apology of sorts for the way the study was presented. “My co-authors and I are very sorry for the way the paper described the research and any anxiety it caused,” he wrote.

But the study is not without value, says Dr. Nicholas Christakis, director of the Human Nature Lab at Yale University, who has studied emotional contagion across social networks. “The scientific concerns that have been raised are mostly without merit,” he says. He points out that while the positivity or negativity of words may not be a validated measure of mood, the study found similar effects in both directions: people responded in comparable ways whether the number of negative or positive words in their feeds was manipulated, which suggests emotional contagion on social media is, indeed, real.

Concerns about people’s privacy being violated by the experiment may also be unwarranted. First, Facebook users know that their data is no longer exclusively their own once it’s on the site. And the whole premise of News Feed is that it’s a curated glance at the most appealing or engaging updates your network of friends might post. That’s why the Cornell University Institutional Review Board (IRB), which reviews and approves all human research studies conducted by its members, gave the experiment the green light. It determined that the study posed minimal risk of disrupting people’s normal environments or behavior, and therefore waived the requirement to obtain informed consent from each participant (something IRBs routinely do for studies involving medical records, prison records and educational information, as long as the scientists keep the owners of the data anonymous).

Should the 698,003 users have been told once the study was done? Perhaps, but only out of courtesy, not for any legal or ethical reason. “Certain items weren’t shown to people in their News Feed,” says James Fowler, professor of medical genetics and political science at the University of California, San Diego, who has collaborated with Christakis and has spoken with Facebook about the company’s research. “This sounds like something that happens to people ordinarily. As a consequence, I’m having a hard time understanding why people are so upset.”

“Things that happen to you that you aren’t aware of can be scary to people,” says Fowler. That could explain why Fowler and Christakis, who conducted a similar intervention by seeding Facebook users’ accounts with messages from friends asking them to vote in an election, weren’t accused of manipulating people in the same way. “It’s fascinating to me that everyone is piling on [this study] when we have already done it,” Fowler says of tweaking people’s social networks to see how doing so influences their reactions.

None of this means anyone has to condone the fact that we’re being studied and analyzed all the time. (The fact that you clicked on this story was recorded by this site’s administrators, along with how long you spend reading it, to gauge whether posts like this one are appealing.)

But if social networks are here to stay, and if, as many intriguing studies suggest, they do have some influence on the way we act and think, then it’s worth trying to figure out how they do it.

“I wouldn’t want the public outcry to shut down the science,” says Fowler. “I would much rather study it and understand it than stick my head in the sand and avoid the issue altogether.”
