‘I Sold My Soul.’ WhatsApp Content Moderators Review the Worst Material on the Internet. Now They’re Alleging Pay Discrimination

When Alex accepted a job interview at Accenture, the global professional services giant that performs content moderation for WhatsApp, one of the first questions the recruiter asked him was whether he was comfortable looking at child sexual abuse, gore, pornography and bestiality.

Alex, who spoke to TIME under a pseudonym for fear of reprisals for breaking a non-disclosure agreement, says he assumed the recruiter meant he would see that kind of content only occasionally. He agreed to take the job as a Spanish-speaking content moderator because he was unemployed and needed the money.

Soon he found himself tasked with reviewing a queue of child abuse images for hours on end, sometimes hundreds in a day. “I was haunted by the things that I saw,” he says. “Sometimes I’d close my eyes and the images would be burned there and I wouldn’t be able to avoid them. There were a lot of nights that I would come home and wouldn’t be able to sleep.”

“I sold my soul for $16.50 an hour,” Alex says.

WhatsApp’s parent company, Facebook, is one of the highest-paying big tech companies in the world, with direct employees typically on six-figure salaries. But its army of more than 15,000 content moderators is paid substantially less for this psychologically taxing work. Most of them are employed not by Facebook, but through third parties like Accenture, often on temporary contracts.


In May 2020, Facebook agreed to pay $52 million to settle a class-action lawsuit brought by thousands of content moderators, who said that Facebook had failed to create a safe working environment for them, and that as a result they had developed post-traumatic stress disorder (PTSD).

But a new front has opened in the battle to get Facebook to recognize and properly compensate one group of workers who do vital, often brutal work for the company: bilingual content moderators like Alex, whose job is to review graphic images and videos in multiple languages from around the world. He is part of a group of Spanish-speaking employees at Accenture’s Austin, Texas, office who work on the firm’s WhatsApp contract and who allege that, on top of the burden of their often disturbing work, they also suffer pay discrimination.

On June 29, Alex signed an open letter alleging that Spanish-speaking employees at Accenture’s offices in Austin and Dublin, Ireland, are treated with “complete disregard.” The letter, first reported by The Verge, now has 129 signatories and alleges “rampant discrimination” by the two companies.

Workers interviewed by TIME, all of whom used pseudonyms for fear of losing their jobs, also describe working in a heavily controlled environment where safeguards are limited and employees are pressured even on how long they take in the bathroom.

An invisible class of workers

Content moderation bears surprising similarities to another industry: coal mining. In 1937, after spending months documenting the brutal conditions faced by coal miners in England, author George Orwell observed that the modern world could only exist because of the body-breaking work and the filth that was endured by the miners, working out of sight underground in dire conditions and for meager pay. “Probably a majority of people would prefer not to hear about it. Yet it is the absolute necessary counterpart of the world above,” Orwell wrote.

Content moderators are “the coal workers of Silicon Valley,” Alex says.

The work of content moderators keeps social media free, for the most part, of illegal and unsavory images, videos, and posts. Without content moderators, a visit to social media platforms could be a game of roulette, in which bad luck could leave you traumatized. Most users would probably think twice about returning to the platforms day after day, and companies would not pay large sums to advertise there.

“The platforms themselves have an incentive to keep content moderation invisible, because the reality of content moderation turns the story that they tell about themselves on its head,” says Cori Crider, the co-founder of Foxglove, a U.K.-based legal nonprofit that advocates for healthier working conditions for content moderators. “The story is that this is a magical piece of software made by a few geniuses in Silicon Valley. The reality is that this is a system of labor that stands on the backs of tens of thousands of people globally—and without those people, the service literally wouldn’t exist.”

Facebook says its artificial intelligence (AI) is becoming increasingly adept at detecting and removing harmful content automatically. But human moderators must still train its algorithms by manually identifying content that breaks the rules. “That’s how the AI learns,” Alex says. “We train it, and then it eventually takes over our jobs.”

Read more: Facebook Says It’s Removing More Hate Speech Than Ever Before. But There’s a Catch

Content moderators working at Accenture for WhatsApp are entitled to 30 minutes of “wellness” breaks per eight-hour shift, employees say, as well as one-on-one access once a month to “wellness coaches” provided by the company. They also undergo training to prepare them for the kinds of content they are expected to look at on the job. But even Accenture acknowledges that the help these resources provide is limited. Moderators working on a Facebook contract for Accenture in Europe were required to sign a document in January 2020 acknowledging “that the wellness coach is not a medical doctor and cannot diagnose or treat mental health disorders.”

TIME viewed a copy of the document, which was first reported by the Financial Times. It requires employees to acknowledge that “the weCare Program [wellness] services, standing alone, may not be able to prevent my work from affecting my mental health.” It also notes that the work “could even lead to Post Traumatic Stress Disorder.” (Facebook told the Financial Times it did not review or approve the document, and was not aware of it.)

The starting wage for single-language content moderators who do this work has risen since Alex was hired, from $16.50 to $18 per hour. The letter signed by WhatsApp moderators alleges that Accenture has a policy of paying a $2 per hour premium to bilingual workers, who are able to review foreign-language content—making their wage $20 per hour—but that this premium is not extended to employees who speak Spanish as their second language.

In a video recording of a June 24 meeting about the pay dispute, viewed by TIME, John Arnold, the managing director of Accenture’s Austin office, said that Spanish-speaking employees do not receive the $2 per hour wage increase afforded to other bilingual workers because Spanish is “not a premium language for the Austin market.”

“They talk about us like we’re a dime a dozen, that they can go outside and pick up five random people on the street who speak Spanish,” says Alejandra, who works for Accenture in Austin.

In a statement to TIME, an Accenture spokesperson did not respond directly to questions about Arnold’s comments, but said: “We strive to ensure that all our people are compensated fairly and equitably with market relevant pay. We have an unwavering commitment to equality and zero tolerance for racism, bigotry and hate of any kind.”

‘I couldn’t stop crying’

Facebook is open about its use of human content moderators for its main platform and Instagram, but a WhatsApp spokesperson said in an email to TIME that it would be inaccurate to describe the work of Accenture employees as “content moderation,” because the app is end-to-end encrypted.

“We provide the option for users to make reports to WhatsApp and we encourage people to do so,” a WhatsApp spokesperson said in a statement to TIME. “We may ban users based on these reports such as when we believe users are involved in sharing child exploitative imagery. Given the nature of private messaging, however, we do not moderate the conversations people have on our service.”

Accenture employees said that the work is sometimes referred to internally as content moderation, and that their jobs bear many similarities to those of content moderators working on Facebook and Instagram, including entitlement to wellness time. Many of the problems faced by employees at Accenture are common to content moderation work across all platforms. But one problem appears to be unique to WhatsApp, stemming from the challenges of moderating a peer-to-peer messaging service as opposed to an open platform.

In addition to content moderators, Accenture employs “customer service agents” whose job is to respond to customer questions. But sometimes the job also involves looking at content that WhatsApp’s systems have incorrectly categorized after users report it, according to two employees.

Two customer service agents who signed the open letter say that they had come across child sexual abuse imagery and other harmful content during the course of their work. Unlike the content moderators, the customer service agents tasked with sending this imagery to the correct moderation teams do not receive training to prepare them for seeing graphic content. And they are not guaranteed “wellness time” or access to wellness coaches, employees said.

“To me, the counseling wasn’t even mentioned until I came across my first piece of child abuse content,” one of the customer service employees, Gloria, tells TIME. At that point, already traumatized, she says she was not given information about how to access wellness resources, although her supervisor told her that counseling could be arranged if she thought she needed it. Gloria says she did not end up pursuing counseling because she felt that revisiting the subject would re-traumatize her.

Alejandra, who is also a customer service agent, says that in addition to coming across child abuse content she was exposed to a graphic video of sexual violence, which triggered an episode of a pre-existing psychological condition. “I was not prepared for anything like that,” Alejandra says. “I couldn’t stop crying. I still have nightmares about it.”

In that instance, Alejandra was given access to counselors. But after the incident, instead of providing training and guaranteeing wellness time to customer service agents, Accenture bosses told them not to open pictures or videos unless they needed to, according to Alejandra.

“The vast majority of the vital work by this customer support team is focused on responding to user questions about how our features work in many languages,” a WhatsApp spokesperson said in a statement. “We require our partners to provide the highest standard of care to this important team that helps make WhatsApp safer for everyone.”

In a separate statement to TIME, the Accenture spokesperson said: “The safety and wellbeing of our people are always a top priority. All of our people, including customer support team members, have access to a suite of well-being services and can use them at any time. These include resiliency training, well-being digital apps and EAP [employee assistance program] support.”

Accenture customer service agents say they also suffer from micromanagement and employment insecurity. If they are inactive for more than eight minutes, they are logged out of their workstations and marked “unavailable,” employees told TIME. The Accenture spokesperson declined to comment on these claims.

“They have been bringing it up in [Accenture] town halls that Facebook isn’t happy with our unavailability,” Alejandra says. “I’m always scared that they’re going to use it as an excuse to fire me.”

“As a woman, we have our menstrual cycles,” Gloria says. “It can take a little over eight minutes to go to the restroom.”

The starting pay for customer service agents is also lower: $16.50 per hour without the $2 bilingual bonus. Alejandra, who helps customers in multiple languages, says she has to work a second job to make ends meet for her family.

Calls for regulation

The Accenture employees who spoke with TIME said that, as well as seeking better pay, they hope speaking out will amplify calls for more stringent protections for content moderators and customer support agents.

“It’s good to have the wellness coaches, but it’s grossly inadequate,” Alex says. “In other industries where you have exposure to hazards, they’ve developed baseline understandings for how much is too much. But there’s no discussions over content moderation and what type of impact that has on the long-term mental health of a person.”

Current workplace safety regulations in the U.S. were written when the job of content moderation didn’t exist. The Occupational Safety and Health Administration (OSHA), the body responsible for workplace safety, is largely focused on physical hazards. It has no binding guidance relating to the mental health risks of content moderation, according to Jacki Silbermann, who studied the employment law of content moderation while a member of Harvard Law School’s Labor and Employment Lab.

“OSHA still doesn’t really have hard and fast rules of how a workplace should be run to mitigate mental health issues stemming from the work itself,” Silbermann says. “When you don’t have those types of rules, workplace protections are obviously quite weak, if they exist at all.”

Meanwhile at Accenture, Alejandra signed the open letter seeking a pay increase so that she doesn’t need to work two jobs to pay the bills. “It’s almost like an abusive relationship,” she says. “They tell you how much they need you, how much you mean to them, and then when the time comes they slam the door in your face.”

Correction: Sept. 23, 2021

The original version of this story misstated the number of content moderators that Facebook employs globally. It is more than 15,000, not more than 35,000.

Write to Billy Perrigo at billy.perrigo@time.com