What If the New York Times Experimented on You Like Facebook?


Suppose a major media outlet had used you, unwittingly, in an experiment to study how its content could affect your emotions. Suppose The New York Times, or ABC News–or TIME magazine–had tweaked the content it displayed to hundreds of thousands of readers to see if certain types of stories put them in a particular frame of mind. The outcry would be swift and furious–brainwashing! mind control! this is how the biased media learns to manipulate us! It would be decried as not just creepy but professionally unethical. And it’s hard to imagine that the publication’s leadership could survive without promising it would never happen again.

Facebook, we recently learned, did just that: in a study conducted in 2012, the company adjusted the news feeds of nearly 700,000 users to display more positive or negative status updates, to determine whether and how the changes would affect users’ emotions. There was indeed an outcry; we may love spinning on the hamster wheel of social media, but no one likes being an unwilling guinea pig. And one of the Facebook researchers behind the study apologized–well, sort of: he was sorry, anyway, “for the way the paper described the research and any anxiety it caused.” But the company also noted that its users agree to this sort of thing when they agree to the terms of service. What? Don’t tell me you don’t read every terms-of-service agreement you click on!

Sorry, that’s not good enough. As a company, Facebook may have every legal right to pursue its own interests–here, trying to ensure that its user experience is as engrossing as possible. But as one of the biggest filters through which people now receive news (along with updates on their cousins’ dogs having puppies), Facebook has as much of an ethical obligation to deliver that experience without hidden manipulation as a newspaper does.

I know, I know: Facebook has said, repeatedly, that it is a tech and not a media company. I don’t blame it! If I were trying to sell stock in my own enterprise, I wouldn’t call myself a media company either.

But semantics aside, for practical purposes, the social media giant is definitely in the media business, whatever other businesses it’s also in. Facebook’s choices and mysterious algorithms increasingly affect the flow of traffic to news sites, and therefore how those sites package and choose their offerings. (Whether you ever see this post, and many others, will often depend on how well it travels on Facebook.) That Facebook is not mainly a media-content creator makes it no less massive a factor in the media ecosystem. And as such, it can’t complain about being held to a code of media ethics.

Let’s face it: Facebook is no more likely to really suffer from this blowup than it has from past controversies over its privacy policies. But if it’s able to beg off from following the standards of media conduct while exerting an ever-greater influence over media, the rest of us will suffer. Facebook can put whatever it wants in the fine print. That shouldn’t keep us from saying that this kind of grossness is wrong, in bold letters.
