Here’s How Facebook’s News Feed Actually Works

Facebook Profile Page, 2014-2015. Facebook updated both the News Feed algorithm and the privacy settings. Courtesy of Alex Fitzpatrick/Facebook

How a controversial feature grew into one of the most influential products on the Internet

There are two very important rooms that will help determine the future of the Facebook News Feed and, by extension, the way more than a billion people communicate. One is in a corner of Facebook’s new 430,000-square-foot, Frank Gehry-designed building in Menlo Park, California. The other is in a nondescript office park in Knoxville, Tennessee.

At Facebook headquarters in California, about 20 engineers and data scientists meet every Tuesday in the “John Quincy Adding Machine” room—“Abraham Linksys” and “Dwight DVD Eisenhower” are nearby. They’re tasked with assessing the billions of likes, comments and clicks Facebook users make each day to divine ways to make us like, comment and click more. In Knoxville, a group of 30 contract workers sit in a room full of desktop computers, getting paid to surf Facebook. They are tasked with scrolling through their News Feeds to assess how well the site places stories relative to their personal preferences. Their assessments, as well as ratings from about 700 other reviewers around the United States, are later fed back to the team in California, all in the service of improving Facebook’s News Feed algorithm, the software that delivers personalized streams of content.

This is a relatively new vision for how to keep users hooked on Facebook—by asking users themselves. In 2014, when the program launched, the social network had already turned the News Feed into a powerful engine, sucking up our time and pumping out ad revenue. Nearly a billion people around the world now look at Facebook daily. The company runs the second-most-popular website in the world and the most-used mobile app in the United States. American users spend nearly as much time on the site per day (39 minutes) as they do socializing with people face-to-face (43 minutes). That has turned Facebook into an online advertising behemoth that generated $12.5 billion in revenue in 2014.

News Feed is at the epicenter of Facebook’s success. Over the past nine years, the product, which was initially controversial, has evolved into the most valuable billboard on Earth—for brands, for publishers, for celebrities and for the rest of us. For years, the News Feed has been fueled by automated software that tracks each user’s actions to serve them the posts they’re most likely to engage with. That proved successful in helping News Feed generate more revenue for Facebook than any other part of the site. But it’s also led to a growing anxiety about how much Facebook knows, and how the company can use that knowledge to influence what users buy, how they vote, even how they feel.

Increasingly, though, Facebook is injecting a human element into the way News Feed operates. The company’s growing army of human raters helps the social network improve the News Feed experience in ways that can’t easily be measured by “Likes.” A new curation tool called “See First,” launching Thursday, for instance, will let any user choose which of their friends they want to see at the top of the feed, rather than having the decision dictated by an algorithm.

The end goal for the world’s largest social network isn’t just to guess what you’ll click on when you’re bored. The company wants to show you the things you care about most in your life, both online and off. “If you could rate everything that happened on Earth today that was published anywhere by any of your friends, any of your family, any news source…and then pick the 10 that were the most meaningful to know today, that would be a really cool service for us to build,” says Chris Cox, Facebook’s chief product officer. “That is really what we aspire to have News Feed become.” The company, it turns out, has a plan for doing just that.

First Controversy

Facebook retired “Move Fast and Break Things” as its unofficial motto in 2014, but you can still find posters emblazoned with the phrase in bold red lettering by employees’ desks inside the company’s sprawling new open workspace. Nothing epitomized the adage more than the controversial launch of the News Feed in 2006.

In its earliest days, Facebook was essentially a directory of profile pages. Users could list their favorite bands, post pictures or write on each other’s profiles, but these activities were mostly discrete. As the social network grew, Facebook engineers noticed that some people were navigating the site in unexpected ways. Every user had access to a page showing when all their friends had last made a change to their profiles. A growing number of people began bouncing from this page to different users’ profiles to figure out what their friends were up to. “Users are usually pretty lazy. They’re not really willing to jump through a lot of hoops to do most things,” says Ari Steinberg, an early Facebook engineer and former manager of the News Feed team who now runs a travel startup. He and others at Facebook realized they needed to provide an easier solution.

In September 2006, two years after its founding, Facebook unveiled a simplified way to keep track of activity on the site. They called it News Feed. The blog post announcing the new feature was true to the company’s dorm room origins, describing it as “a personalized list of news stories throughout the day, so you’ll know when Mark adds Britney Spears to his Favorites or when your crush is single again.”

Users were, for the first of many times, outraged by the update. Profile changes had previously been relatively minor. No one was prepared for their online activity to suddenly be fodder for mass consumption—even by their friends. Immediately, groups with names like “Students Against Facebook News Feed” sprouted up, condemning the new feature and amassing hundreds of thousands of members. (Ironically the News Feed itself, with its capacity for seamlessly connecting acquaintances, allowed the backlash to flourish.) But there was another aspect of News Feed that was less obvious to users: it wasn’t showing people all the potential stories they could be seeing. Even in those days, Cox says, the feed was curated because there was simply too much content to show everyone everything.

The first iterations of the News Feed algorithm were pretty crude. Based largely on their own intuition about what people liked, engineers assigned point scores to different story formats (a photo might be worth 5 points, while joining a group was worth 1 point). Multiplying the score of the post type by the number of friends involved in the story would yield a general ranking order for posts. The formula might be tweaked based on emailed complaints from users or problems staffers saw in their own feeds. “We would just make all these arbitrary judgments,” recalls Steinberg.
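
To make that concrete, here is a rough sketch in Python of the kind of hand-tuned scoring Steinberg describes. The point values for photos and group joins come from the article; the other numbers, names and structure are invented for illustration and are not Facebook’s actual code.

# Illustrative sketch of the early, hand-tuned News Feed scoring.
# "photo" = 5 and "group_join" = 1 come from the article; everything
# else here is hypothetical.
STORY_TYPE_POINTS = {
    "photo": 5,
    "status": 2,       # invented value
    "group_join": 1,
}

def story_score(story_type, num_friends_involved):
    # Score = post-type points multiplied by the number of friends involved.
    return STORY_TYPE_POINTS.get(story_type, 1) * num_friends_involved

# Rank a handful of candidate stories, highest score first.
stories = [("photo", 2), ("group_join", 4), ("status", 3)]
ranked = sorted(stories, key=lambda s: story_score(*s), reverse=True)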

As Facebook grew, News Feed became more flexible. Eventually the algorithm ranked content considering recency, post type and the relationship of the poster to the end user in a formula that came to be known as EdgeRank. The debut of the “Like” button in 2009, which let users endorse specific pieces of content for the first time, helped News Feed hone in even more on which stories people actually enjoyed.
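
A plausible reading of EdgeRank, based on how it has been publicly described, multiplies three terms per story: the viewer’s affinity with the poster, a weight for the post type, and a time decay for recency. The functional form and constants below are illustrative guesses, not a published formula.

def edgerank_score(affinity, type_weight, post_age_hours, half_life_hours=6.0):
    # Newer stories score higher; the 6-hour half-life is an invented constant.
    decay = 0.5 ** (post_age_hours / half_life_hours)
    return affinity * type_weight * decay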

Around 2011, Facebook moved on from EdgeRank to a more complex machine learning system that better individualizes each user’s experience. Instead of assuming that all users enjoy photos, the algorithm would adapt to users’ behavior so that people who click on photos see more pictures and people who don’t click on them see fewer. This is the algorithm that’s currently powering your News Feed, and the one Facebook’s engineers are constantly tinkering with. “You have a lot of impact,” Steinberg says about working on the News Feed. “When that team makes a change, the rest of the company is going to be paying attention.”
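
The shift Steinberg describes, from fixed weights to learned, per-user behavior, can be pictured with a toy model that estimates each user’s click rate per post type and ranks accordingly. Facebook’s production system is vastly more sophisticated; this sketch only shows the feedback loop in miniature.

from collections import defaultdict

class PerUserClickModel:
    def __init__(self):
        self.shown = defaultdict(lambda: defaultdict(int))
        self.clicked = defaultdict(lambda: defaultdict(int))

    def record(self, user, post_type, was_clicked):
        # Log every impression and click so the model adapts over time.
        self.shown[user][post_type] += 1
        if was_clicked:
            self.clicked[user][post_type] += 1

    def score(self, user, post_type):
        # Smoothed click rate: users who click photos will see more photos,
        # and users who ignore them will see fewer.
        shown = self.shown[user][post_type]
        clicked = self.clicked[user][post_type]
        return (clicked + 1) / (shown + 2)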

How News Feed Works

Nowadays the News Feed team has a difficult balancing act. The feed must be completely personalized but still highly engaging to Facebook’s users so they’ll keep coming back and seeing more ads from the company’s 2 million advertisers. But most users see only a sliver of the potential posts in their network each day. Facebook says the average user has access to about 1,500 posts per day but only looks at 300. (A user who scrolls endlessly will eventually see every post from their friends and a smattering of posts from Pages they follow.)

To ensure that those 300 posts are more interesting than all the rest, Facebook says it uses thousands of factors to determine what shows up in any individual user’s feed. The biggest influences are pretty obvious. How close you are to a person is an increasingly important metric, as judged by how often you like their posts, write on their Timeline, click through their photos or talk with them on Messenger, Facebook’s chat service. The post type is also a big factor, as Facebook hopes to show more links to people who click lots of links, more videos to people who watch lots of videos and so forth. The algorithm also assumes that content that has attracted a lot of engagement has wide appeal and will place it in more people’s feeds.
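
One way to picture the closeness metric is as a weighted tally of exactly the interactions the article lists: likes, Timeline posts, photo clicks and Messenger conversations. The weights below are invented; Facebook has not published them.

def affinity(likes_given, timeline_writes, photo_clicks, messenger_chats):
    # Hypothetical weights; heavier interactions count for more.
    return (1.0 * likes_given
            + 3.0 * timeline_writes
            + 0.5 * photo_clicks
            + 2.0 * messenger_chats)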

But there are other, less intuitive factors to the algorithm. Use a phone with a slow mobile connection and you may see less video. Writing “congratulations” in a comment signals the post is probably about a big life event, so it will get a boost. Liking an article after you clicked it is a stronger positive signal than liking before, since it means you probably read the piece and enjoyed it.
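
Those contextual signals could plausibly enter the ranking as adjustments on top of a base score, along these lines. The multipliers are invented for illustration.

def adjust_score(base_score, is_video, slow_connection, comments, liked_after_click):
    score = base_score
    if is_video and slow_connection:
        score *= 0.5   # demote video for users on slow mobile connections
    if any("congratulations" in c.lower() for c in comments):
        score *= 1.5   # likely marks a big life event, so boost the post
    if liked_after_click:
        score *= 1.3   # a like after reading is a stronger positive signal
    return score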

The exact calculus that determines how all these factors coalesce is in constant flux, according to the company. Engineers are also continually running multiple experiments with about 1% of Facebook users in an attempt to boost engagement. During a week in mid-May, for instance, one test was giving greater preference to tagged photos that included close friends, while another was boosting the ranking for stories people spent more time looking at on the iPhone. Experiments that are deemed successful by News Feed’s product managers are quickly rolled out globally to all users. Big updates are noted on a News Feed blog, but smaller changes occur without fanfare. Two to three changes typically occur every week.
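
The article doesn’t say how Facebook assigns the roughly 1% of users to each experiment, but a standard technique is to hash the user ID together with the experiment name, so assignment is stable for each user and independent across experiments. A generic sketch:

import hashlib

def in_experiment(user_id, experiment_name, percent=1.0):
    # Stable bucketing: hashing the experiment name with the user ID keeps
    # a user's assignment fixed and independent across experiments.
    digest = hashlib.sha256(f"{experiment_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10000
    return bucket < percent * 100   # percent=1.0 admits buckets 0-99, i.e. 1%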

Despite its heavy reliance on data, News Feed’s minders are quick to point out that their directive is not only to boost engagement metrics. “What we don’t want to have News Feed turn into is a thing that is just like an aggregation of whatever you may have clicked on,” says Cox. “What we really want it to be is the most meaningful stuff.”

The new team of human raters, which Facebook calls the “feed quality panel,” is key to surfacing this meaningful content. Each day a typical panelist rates 60 stories that would actually appear in their own News Feeds on a 1 to 5 scale, judging how interesting they found the posts. They also reorder their own feeds to show how their priorities differ from the algorithm’s, providing what Facebook calls a transposition score. And they write paragraph-long explanations for why they like or dislike certain posts, which are often reviewed in the News Feed engineers’ weekly meetings. Facebook also regularly conducts one-off online surveys about News Feed satisfaction and brings in average users off the street to demo new features in its usability labs.
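
Facebook hasn’t published how the transposition score is computed, but one natural measure of how far a rater’s hand-built ordering strays from the algorithm’s is the average displacement of each story between the two orderings:

def transposition_score(algo_order, rater_order):
    # Both lists contain the same stories. A score of 0.0 means the rater
    # agreed with the algorithm exactly; larger means heavier reordering.
    rater_position = {story: i for i, story in enumerate(rater_order)}
    displacement = sum(abs(i - rater_position[s]) for i, s in enumerate(algo_order))
    return displacement / len(algo_order)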

Some of the changes sparked by these methods have already reached you on Facebook, such as an April tweak that placed increased priority on posts by close friends and a June change that added time spent reading a post as an algorithm factor. The company sees these kinds of improvements, which can’t necessarily be measured through basic engagement metrics, as crucial. Facebook says the feed quality panel is currently representative of its U.S. user base, but the company has plans to expand the initiative globally. “It’s fundamentally human,” says Cox of the rater data, “and it’s not fundamentally like a click.”

An Algorithmic World

As Facebook’s News Feed gets smarter and we spend more time interacting with it, its impact on the average person will likely come under increasing scrutiny. When University of Illinois Urbana-Champaign researcher Karrie Karahalios was trying to conduct a study on how people change their online behavior based on the presence of a News Feed algorithm, though, she ran into a problem. Many of the people she interviewed didn’t even realize the algorithm existed.

In Karahalios’ 2013 study, which involved 40 subjects who were selected to mimic the demographics of the U.S. population, 62% of people didn’t know that their News Feeds were being filtered. When the algorithm was explained to one subject, she compared the revelation to the moment when Neo discovers the artificiality of The Matrix. “We got a lot of visceral responses to the discovery when they didn’t know,” Karahalios says. “A lot of people just spent literally five minutes being in shock.”

Though from a small group, that reaction underscores the growing anxiety among Internet users about having algorithms help control their digital lives, not just on Facebook but on sites like Google, Amazon and Twitter. Facebook sparked a firestorm in June of last year when it published a study detailing how it purposefully changed the number of positive or negative posts in 700,000 users’ News Feeds in an attempt to alter their moods. (Facebook eventually acknowledged that the research was mishandled.) Two months later, when Ferguson was wracked with protests over the shooting of Michael Brown, media observers noted that many people’s News Feeds were being filled with cheerful videos of people dumping water on their heads in support of the charitable Ice Bucket Challenge rather than images from the news event.

Adam Mosseri, the product management director for News Feed, calls the Ice Bucket Challenge “anomalous” but acknowledges that it dominated other news at the peak of its popularity, which was around the time of the Ferguson protests. “There was a lot more news in the News Feed over the last year than there was ice bucket challenges, but in that week, it wasn’t the case,” he says.

Still, he argues that it’s not Facebook’s job to make sure every user sees a story that some might think is critically important. “We can’t start an editorialized feed,” Mosseri says. “That doesn’t mean we don’t have values, but there’s a line that we can’t cross, which is deciding that a specific piece of information—be it news, political, religious, etc.—is something we should be promoting. It’s just a very, very slippery slope that I think we have to be very careful not to go down.”

News Feed engineers stress that their team doesn’t perform sentiment analysis—so the algorithm isn’t generally trying to find posts with a positive tone and place them in more people’s feeds. (The team that conducted the controversial mood study was part of a different department.) Still, some sociologists argue that the very way Facebook is structured incentivizes posting feel-good stories such as ice bucket videos rather than more challenging narratives like Ferguson.

“Things that are likable get more feedback,” says Zeynep Tufekci, a sociologist at the University of North Carolina who studies online interactions. “Even if you’re not explicitly told this, people over time will adapt their behavior. It’s like you’re rats in a maze, and if you go this way, you get cheese, and if you go that way, you don’t get cheese. By structuring the environment, Facebook is training people implicitly to behave in a particular way in that algorithmic environment.”

Facebook uses other indicators, such as number of comments, to surface posts that aren’t likely to yield a lot of “Likes.” The more recent addition of time spent reading a post as an algorithm factor should also help more complicated content rise to the top. But the social network has stopped short of adding a “sympathize” or “dislike” button, which Cox says would complicate Facebook’s interface.

Another issue with the algorithm, sociologists say, is that users aren’t explicitly informed of its existence, and those who do know about it don’t have a clear sense of how their actions on the site affect what they will see. In Karahalios’ study, many people voiced a common Facebook complaint: too many baby photos in their feeds. They said they would Like a friend’s baby picture out of a feeling of obligation, but then immediately hide the post to try to tell Facebook they didn’t actually want a feed full of toddlers. There’s no way for the average user to know how the social network interpreted those conflicting signals. “People like to feel empowered,” Karahalios says. “People felt some of the power was taken away from them and somebody else was making decisions that affected them.”

Facebook has addressed these concerns somewhat by making it easier for users to control which friends and Pages appear in their feeds most often. The company says its “Unfollow” button, which lets you remove a person from your News Feed without un-friending them, is heavily used. And the new “See First” feature gives users a simple way to automatically place up to 30 friends’ and Pages’ posts at the top of their feeds whenever they appear. A more streamlined settings menu will help users easily access all these features at once. “We want to give people controls that are simple and easy to use but have a powerful impact on their News Feed,” says Greg Marra, a product manager for News Feed.
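
Mechanically, “See First” amounts to a simple override on top of the ranked feed: posts from the chosen friends and Pages, capped at 30 sources, are pulled to the front while everything else keeps its algorithmic order. A minimal sketch, with the post structure assumed:

SEE_FIRST_LIMIT = 30

def apply_see_first(ranked_posts, see_first_sources):
    # Posts are dicts with an "author" key (an assumption for this sketch).
    chosen = set(list(see_first_sources)[:SEE_FIRST_LIMIT])
    pinned = [p for p in ranked_posts if p["author"] in chosen]
    rest = [p for p in ranked_posts if p["author"] not in chosen]
    return pinned + rest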

Some users will no doubt continue to clamor for even more control. Tufekci suggests a simplified way to directly control how often the algorithm shows different types of content. (Marra says organizing controls around friends and Pages is easier for users to understand.) Karahalios would like to see something in the actual design of the website—and other websites that prominently use algorithms—that makes it clearer to users that the content they see is being filtered. “We need to start getting some common languages, some norms, where people can understand what’s filtered and what’s not in more than just Facebook,” she says. “I think people have a right to know, for better or worse.”
