Inside Facebook’s African Sweatshop

In a drab office building near a slum on the outskirts of Nairobi, Kenya, nearly 200 young men and women from countries across Africa sit at desks glued to computer monitors, where they must watch videos of murders, rapes, suicides, and child sexual abuse.

These young Africans work for Sama, which calls itself an “ethical AI” outsourcing company and is headquartered in California.

Sama says its mission is to provide people in places like Nairobi with “dignified digital work.” Its executives can often be heard saying that the best way to help poor countries is to “give work, not aid.” The company claims to have helped lift more than 50,000 people in the developing world out of poverty.


This benevolent public image has won Sama data-labeling contracts with some of the largest companies in the world, including Google, Microsoft and Walmart. What the company doesn’t make public on its website is its relationship with its client Facebook.

Here in Nairobi, Sama employees who speak at least 11 African languages between them toil day and night, working as outsourced Facebook content moderators: the emergency first responders of social media. They perform the brutal task of viewing and removing illegal or banned content from Facebook before it is seen by the average user.

Since 2019, this Nairobi office block has been the epicenter of Facebook’s content moderation operation for the whole of Sub-Saharan Africa. Its remit includes Ethiopia, where Facebook is trying to prevent content on its platform from contributing to incitement to violence in an escalating civil war.

Despite their importance to Facebook, the workers in this Nairobi office are among the lowest-paid workers for the platform anywhere in the world, with some of them taking home as little as $1.50 per hour, a TIME investigation found. The testimonies of Sama employees reveal a workplace culture characterized by mental trauma, intimidation, and alleged suppression of the right to unionize. The revelations raise serious questions about whether Facebook—which periodically sends its own employees to Nairobi to monitor Sama’s operations—is exploiting the very people upon whom it is depending to ensure its platform is safe in Ethiopia and across the continent. And just as Facebook needs them most, TIME can reveal that content moderators at Sama are leaving the company in droves due to poor pay and working conditions, with six Ethiopians resigning in a single week in January.

This story is based on interviews with more than a dozen current and former Sama employees, as well as hundreds of pages of documents including company emails, payslips and contracts. Most employees spoke on condition of anonymity for fear of legal consequences if they disclose the nature of their work or Facebook’s involvement. The Signals Network provided psychological and legal support for whistleblowers quoted in this story.

The office in Nairobi, Kenya, where Facebook content moderators began working in 2019, photographed on Feb 10, 2022. Sama was known publicly as Samasource until early 2021. Khadija Farah for TIME

“The work that we do is a kind of mental torture,” one employee, who currently works as a Facebook content moderator for Sama, told TIME. “Whatever I am living on is hand-to-mouth. I can’t save a cent. Sometimes I feel I want to resign. But then I ask myself: what will my baby eat?”

TIME is aware of at least two Sama content moderators who chose to resign after being diagnosed with mental illnesses including post-traumatic stress disorder (PTSD), anxiety, and depression. Many others described how they had been traumatized by the work but were unable to obtain formal diagnoses because they could not afford quality mental healthcare. Some described continuing with the work despite trauma because they had no other options. While Sama employs wellness counselors to provide workers with on-site care in Nairobi, most of the content moderators TIME spoke to said they generally distrust the counselors. One former wellness counselor says that Sama managers regularly rejected counselors’ requests to let content moderators take “wellness breaks” during the day, because of the impact it would have on productivity.

Workers say Sama has also suppressed their efforts to secure better working conditions. In the summer of 2019, content moderators threatened to strike within seven days unless they were given better pay and working conditions. Instead of negotiating, Sama responded by flying two highly-paid executives from San Francisco to Nairobi to deal with the uprising. Within weeks Daniel Motaung, the attempted strike’s leader who was in the process of formally filing trade union papers, had been fired—accused by Sama of taking action that would put the relationship between the company and Facebook at “great risk.” Sama told other participants in the labor action effort that they were expendable and said they should either resign or get back to work, several employees told TIME. The workers stood down before the seven days were up, and there was no pay increase.

Are you a content moderator at Sama or elsewhere? TIME would like to speak with you. Please get in touch with the author: billy.perrigo@time.com

“At Sama, it feels like speaking the truth or standing up for your rights is a crime,” a second employee tells TIME. “They made sure by firing some people that this will not happen again. I feel like it’s modern slavery, like neo-colonialism.” (Sama disputes this characterization, and said in a statement that its content moderators are paid triple the Kenyan minimum wage.)

Foxglove, a legal NGO based in London, says it has informed Sama it is preparing legal action in relation to its alleged wrongful termination of Motaung. “Firing workers for trying to organize is against the law,” says Cori Crider, Foxglove’s director. “Daniel did a brave thing by blowing the whistle here—as was his legal right.” The Katiba Institute, a Kenyan public-interest law firm, is assisting with the case.

Sama denies that there was any strike or labor action. “We value our employees and are proud of the long-standing work we have done to create an ethical AI supply chain,” Shriram Natarajan, the head of Sama’s Nairobi office, said in an emailed statement. “We exist to provide ethical AI to our global customers and we are proud of the role our employees play in building new online experiences and cleaning up the internet. It’s a tough job and it’s why we invest heavily in training, personal development, wellness programs, and competitive salaries.”

Facebook says it spent more than $5 billion on safety measures in 2021. It contracts the services of more than 15,000 content moderators globally, most of whom are employed by third parties like Sama. In response to a detailed set of questions for this story, a spokesperson for Facebook’s parent company Meta said: “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading pay, benefits and support. We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect of them.”

A strike, struck down

Daniel Motaung was a 27-year-old university graduate from South Africa looking for his first job when he came across an online ad from Sama seeking Zulu speakers.

It was early 2019, and Sama had recently won a contract to provide content moderation for Facebook’s Sub-Saharan Africa markets. Sama placed job ads in countries across Africa, both directly and through agencies, looking for people with fluency in different African languages who were willing to relocate to Kenya.

Motaung, like many other moderators TIME spoke with, says he had little idea what content moderation involved when he applied for the job. He thought it simply involved removing false information from social media. He says he was not informed during his interview that the job would require regularly viewing disturbing content that could lead to mental health problems. After he accepted and arrived in Kenya, Sama asked him to sign a non-disclosure agreement, and only then did the company reveal the type of content he would be working with daily. By then, he felt it was too late to turn back.

Several other current and former content moderators described similar experiences of not being warned about the nature of the job. Two, from separate countries, said they had answered job ads placed via agencies for “call center agents.”

Elsewhere in the world, similar working conditions have landed Facebook in hot water. In 2020, the social network paid out $52 million to fund mental health treatment for some of its American content moderators following a lawsuit centered on mental ill health stemming from their work, including PTSD. In the U.S. and Europe, many Facebook content moderators employed by the outsourcing firm Accenture are now asked to sign a waiver before they begin their jobs, acknowledging that they may develop PTSD and other mental health disorders. African content moderators working for Sama say they are not asked to sign such a waiver.

According to payslips seen by TIME, Sama pays foreign employees monthly pre-tax salaries of around 60,000 Kenyan shillings ($528), which includes a monthly bonus for relocating from elsewhere in Africa. After tax, this equates to around $440 per month, or a take-home wage of roughly $2.20 per hour, based on a 45-hour work week. Sama employees from within Kenya, who are not paid the monthly relocation bonus, receive a take-home wage equivalent to around $1.46 per hour after tax. In an interview with the BBC in 2018, Sama’s late founder Leila Janah attempted to justify the company’s levels of pay in the region. “One thing that’s critical in our line of work is to not pay wages that would distort local labor markets,” she said. “If we were to pay people substantially more than that, we would throw everything off.”
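
The hourly figures above follow from simple arithmetic. Below is a minimal sketch in Python, assuming a 45-hour week averaged over a 52-week year; it is illustrative only, and the variable names are ours, not Sama’s payroll terms.

```python
# Back-of-envelope check of the take-home pay figures reported above.
# Assumes a 45-hour work week averaged over a month (52 weeks / 12 months);
# illustrative only, not Sama's actual payroll formula.

HOURS_PER_MONTH = 45 * 52 / 12  # = 195 hours per month

foreign_take_home = 440.0  # USD per month after tax, per payslips seen by TIME
print(f"Foreign hires: ${foreign_take_home / HOURS_PER_MONTH:.2f}/hour")  # about $2.26

kenyan_hourly = 1.46  # USD per hour after tax, per TIME's calculation
print(f"Kenyan hires: about ${kenyan_hourly * HOURS_PER_MONTH:.0f}/month take-home")  # about $285
```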

Employees themselves didn’t see it that way. One day in July 2019, Motaung got talking to a group of other moderators who had been hired four months earlier. He recounts that many said they felt the job they had applied for was not the one they were doing, and discussed their low pay and poor working environment. Some said they had done research showing that content moderators in other countries were being paid far more for the same work. They resolved to band together and take action to better their conditions. Motaung took the lead, and he and his colleagues created a WhatsApp group chat to begin canvassing opinion more widely. Soon, the group had more than 100 members.

Employees who were in the group chat say they discussed the trauma of the work and how many felt they had been hired under false pretenses. Kenyan employees said it was unfair that they were paid around 30% less than the foreigners in the office, and that they had not yet received the medical insurance they say they had been promised. Other employees were frustrated that Sama had recently introduced compulsory night shifts to meet Facebook’s demand for 24-hour coverage.

Based on the discussions in the chat, Motaung drafted a petition with a list of demands for Sama’s management, including that everyone’s pay be doubled. The document, seen by TIME, said that if management did not “substantively engage” with the demands within seven days, employees would go on strike.

Motaung knew that Sama employees stood a better chance of having their demands met by acting as a group, because Sama was dependent on “the client”: Facebook. If they all stopped working at once, he reasoned, Facebook would want to know why so much of its African content was no longer being moderated, and could put significant pressure on Sama to accede to workers’ demands. “I explained that alone, all of us are expendable,” he recalls telling his colleagues. “But if we go as a group, remember, they have a contract with the client. The terms of the contract are that [Sama is] to deliver on a regular basis, no interruptions whatsoever. We knew if all of us stopped working right now, they would not be able to replace us within a week or within a month.”

Kibera, the largest informal settlement in Africa, out of which Sama says many of its workers are hired. Khadija Farah for TIME

The Alliance, as the group of employees began calling themselves, presented their petition to Sama’s management in a meeting on July 30, 2019. Two senior Sama executives from San Francisco joined via video-conference, but they dismissed the workers’ concerns, according to Motaung and others. “They told us there are lots of people who are dying to get this job, that they did research on the wages and this is a nice wage considering what people are getting in Kenya,” says one employee who was present during the meeting. A 2021 study carried out by three MIT researchers found the average salary at Sama including benefits was approximately 2.5 times the Kenyan minimum wage. But even so, these wages only cover the basic costs of living, workers say, and don’t allow them to save or improve their financial situations.

Within days, the two executives from San Francisco had arrived in Nairobi, and Motaung was suspended from his job pending a disciplinary hearing. Sama told Motaung some of his colleagues had accused him of bullying, intimidating and coercing them to sign their names to the list of demands. He was told to stay away from the office and barred from talking to his colleagues. Motaung says the allegations that he bullied more than 100 of his colleagues into signing a petition for better pay and working conditions are ridiculous. He suspects that Sama intimidated several of his former colleagues into making statements against him. “It was just them pretending to follow a process so that they can get rid of me quickly, so that everything can go back to normal,” he says. Sama did not comment on allegations of worker intimidation.

Meanwhile, other employees involved in the attempted strike action were being invited to individual meetings with Cindy Abramson, one of the executives who had flown in from San Francisco. Two employees who were particularly vocal during the worker revolt said that Abramson flattered them in these meetings, suggesting that they had leadership potential, and dangled the prospect of promotion if they could convince their colleagues to stand down.

Three rank-and-file participants in the labor action told TIME that during their own one-on-one meetings, Abramson, whose total compensation in 2018 was $194,390, according to Sama’s public filings, intimidated them into withdrawing their names from the petition, saying that they must choose between disaffiliating from the Alliance and losing their jobs. Her warnings were especially stark toward Kenyan employees, according to people with knowledge of the discussions. The Kenyans were reminded in the meetings that they were more easily replaceable than foreign employees, which many of them took as a threat of being fired if they did not stand down. Scared, many people began withdrawing their signatures from the petition. “They threatened us, and we backed down,” says one Kenyan employee, who reasoned that it was better to have a low-paying job than no job at all.

“There never was a strike or labor action,” Sama said in its statement to TIME. “Being a responsible employer, we wanted to see our team in person, meet with everyone face-to-face and address their concerns head-on. It’s why we flew members of our leadership team to our offices in Nairobi and it’s a decision we stand behind.” The statement also said that after employees asked for higher salaries, the company conducted a pay audit and found they were already being paid double the living wage for the region. Sama said it has since changed its onboarding processes to “be more transparent about what to expect and we intensified our onboarding program by developing new training modules to ensure team members were prepared on how to handle the functions of the role.” Abramson, who has since left Sama, declined to comment.

Two weeks after his suspension, Sama fired Motaung, claiming he was guilty of gross misconduct “for engaging in acts that could amount to bullying, harassment and coercion and that led to the disruption of business activities and put the relationship between Samasource and its client [Facebook] at great risk,” according to a termination letter dated August 20, 2019. (Sama was known publicly as Samasource until early 2021, when it changed its name as part of a transformation that also included switching from a non-profit organization to a business.) The letter also noted Motaung’s leadership role within the Alliance, and said that he had advised his colleagues not to attend one-on-one meetings with management. Sama did not respond to questions about its firing of Motaung, but said in its statement that it had dismissed three employees who had “violated workplace rules.”

In the days before he was terminated, Motaung was busy drafting documents that would have formally established the Alliance as a union under Kenyan law. “I think they found out. While I was doing that, I received a letter terminating my employment,” he says. Motaung’s work permit was then canceled, leaving him just three weeks to leave Kenya for his native South Africa.

Kenyan labor law says employees are protected from dismissal as a result of “past, present or anticipated trade union membership,” and the Kenyan constitution says every worker has the right to go on strike.

Before he left Kenya, Motaung says, he handed the union incorporation papers to another employee in the movement. But the resolve of the Alliance had been broken, and the union never materialized. “We were in shock, devastated, broken,” one employee said. “And then life continued. After that, nobody dared to speak about it.”

Daniel Motaung. Aart Verrips for TIME
Jason White. Aart Verrips for TIME

For a time, however, a spark of resistance remained. Jason White, a former Afrikaans quality analyst from South Africa, says Sama fired him around a year later, in the summer of 2020. He had been a participant in the Alliance, and continued to ask questions even after most of his colleagues had given up. He says he regularly asked managers whether Sama was deducting too much tax from employees’ payslips, and why his girlfriend, also an employee with the company at the time, was not provided a work permit despite being promised one.

In July 2020, White and a colleague took their concerns to the South African embassy in Nairobi. In a series of emails, reviewed by TIME, and then at a meeting in person, the pair informed South African officials about the thwarted strike and Motaung’s firing the previous year, and how some of their colleagues believed they had been hired under false pretenses. The officials promised to investigate, but never followed up. The embassy did not respond to a request for comment.

After that, “there was a definite change in behavior from [Sama’s] top management,” White says. Soon after, a Sama manager offered him a payment equivalent to two months’ salary on the condition that he stop mentioning the pay and conditions at Sama to anybody, he says. He declined.

Then, White says, he was called into a disciplinary hearing, charged with having unauthorized contact with a Facebook employee, which was forbidden under his employment contract. White says he believes this was a reference to an email he had sent to a Facebook staffer who had previously visited the Nairobi office, in which he revealed his pay and asked her whether she believed Sama was exploiting him and other employees. Although he never received a reply, he believes the staffer told Facebook, which informed Sama. “The company saw that as a good opportunity to use that against me,” he says. Sama did not address TIME’s questions about White’s dismissal or the alleged payment offer extended to him. A Facebook spokesperson said that the email’s recipient had followed protocol.

A system driven by Facebook

Once every few months, Facebook employees travel from Dublin to Nairobi to lead trainings, brief content moderators on new policies, and answer questions. Five content moderators said that ahead of these meetings, Sama managers regularly instruct workers not to discuss their pay with Facebook staff.

But satisfying Facebook is at the center of the work culture Sama has created.

When Idris (one of the content moderators cited above who asked to use a pseudonym out of fear for his personal safety and job prospects) arrives at Sama’s office in Sameer Business Park each day, he logs into a piece of software designed by Facebook.

As soon as Idris looks at his first piece of content, a clock starts ticking. He might be confronted with graphic images or videos depicting dismemberment, murder or rape. No matter how disturbing the content, Idris must decide within 50 seconds whether to take down or leave up the material—a target laid down by his Sama bosses. Every week, his average handling time (AHT) is measured against that 50-second target during a formal review process. (The target can rise as high as 70 seconds, or sink as low as 36, depending on workload and staffing, according to employees.) If Idris takes too long reviewing each piece of content, he might be reprimanded by his team leader. If the problem persists, he will be pulled off Sama’s Facebook contract and put on an internal training program instead. If he still does not work fast enough, he believes, he could be fired.

Moderators like Idris are expected to maintain an AHT of around 50 seconds, regardless of whether the video they are reviewing is minutes or even hours long. Facebook guidelines seen by TIME—previously unreported—instruct content moderators to watch only the first 15 seconds of a video before marking it as OK to remain on the platform and moving on to the next piece of content—as long as the title, transcript, top comments and thumbnail of the video appear to be innocent, and neither users nor Facebook’s AI systems have flagged specific points in the video.
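
As a rough illustration, the triage rule those guidelines describe could be sketched as follows. This is a reconstruction in Python of the workflow as described to TIME; every function and field name is hypothetical, and the code is not Facebook’s actual moderation tooling.

```python
from dataclasses import dataclass, field

@dataclass
class Video:
    """Hypothetical stand-ins for the signals the guidelines mention."""
    title: str
    transcript: str
    top_comments: list[str]
    thumbnail_ok: bool  # thumbnail appears innocent
    flagged_points: list[int] = field(default_factory=list)  # seconds flagged by users or AI

def looks_innocent(text: str) -> bool:
    # Stand-in for the moderator's own judgment of the surrounding text.
    return True  # placeholder only

def triage(video: Video) -> str:
    """Apply the 15-second rule as described in the guidelines."""
    surroundings_clean = (
        looks_innocent(video.title)
        and looks_innocent(video.transcript)
        and all(looks_innocent(c) for c in video.top_comments)
        and video.thumbnail_ok
        and not video.flagged_points
    )
    if surroundings_clean:
        # Watch only the first 15 seconds, then leave the video up and move on.
        return "watch 0:00-0:15, mark OK"
    # Otherwise, review the flagged points or watch further before deciding.
    return "review flagged points / watch further"
```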

This policy, which prioritizes speed and efficiency above all else, might help explain why videos containing hate speech and incitement to violence have remained on Facebook’s platform in Ethiopia.

TIME reviewed several examples of lengthy Facebook Live videos that contained hate speech and incitement to violence well into the body of the videos. In one example, a two-hour-long video with more than 1,000 views, a man speaking Amharic says that anyone married to Tigrayans or Oromos—two major ethnic groups from Ethiopia—is a traitor and an enemy of the state. In a coded call for ethnic cleansing, the man says that Amharas must not live in an Ethiopia that contains members of those groups. An Ethiopian digital rights group, Network Against Hate Speech, told TIME in March 2021 that it had reported this video and more than 70 others to Facebook. But this video was not taken down until at least three months later. “The majority of posts that have taken [a] long [time] before removal … are videos,” Network Against Hate Speech told TIME in an email last year. In the months since, the Ethiopian civil war has only escalated, with reports of mass atrocities and murders of civilians along ethnic lines.

Facebook whistleblower Frances Haugen said in an interview that the working conditions described by content moderators at Sama appear to have had a serious impact on Facebook’s ability to police content in Ethiopia. “I am entirely unsurprised that these moderators are not being treated with the dignity that they deserve,” she told TIME. “It is tragic that the consequence of this devaluing of human beings is that others, in some of the most vulnerable places in the world, are now suffering as well.”

Adding to the pressure they feel, Idris and other Sama content moderators are also expected to make the correct call at least 84% of the time: a target known as “quality score,” which team leaders track each week and use to measure what they deem to be underperformance. Some pieces of content are reviewed twice, first by Idris, and then by a more senior “quality analyst,” who is generally a former content moderator with good knowledge of Facebook’s policies. If the pair disagree on whether a piece of content should be permitted to remain on the platform, the quality analyst’s decision is taken as final, and Idris’s quality score ticks downward. Even if they agree a piece of content should be removed, they must also agree on the reason for removal.

Facebook has put some features in place to help protect moderators, like the option to render videos in black and white or add blurring. But one Sama employee says that he does not use these options because of the pressure to meet quotas. “How can you clearly see whether content is violating or not unless you can see it clearly? If it’s black and white or blurred, I cannot tell,” the employee says. “Some people use the option, but I don’t. Because if I wrongly action that [content], I will be marked down.”

Employees say they are expected to work up to nine hours per day including breaks, and their screen time is monitored. “I cannot blink,” one employee says. “They expect me to work each and every second.” In a statement, Sama said that it caps working hours for content moderators at 37 hours per week. However, TIME reviewed an employment contract from 2019 stating that workers can be expected to work up to 45 hours per week without additional compensation. It is unclear whether that figure includes breaks.

TIME reviewed several copies of Sama content moderators’ performance reviews, where they were measured against target metrics for AHT and quality. In one email, a manager chastises a content moderator for spending too much time not moderating content while logged in.

Employees say that on a typical working day, they are expected to spend around eight hours logged into Facebook’s content moderation program. On such a day, a target of 50 seconds per piece of content would equate to a de-facto daily quota of nearly 580 items.
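
The quota figure follows directly from those targets. A minimal check, assuming eight hours logged in at the 50-second target described above:

```python
# De-facto daily quota implied by the AHT target described above.
# Assumes 8 hours logged into the moderation tool at a 50-second target;
# illustrative arithmetic only.

logged_in_seconds = 8 * 60 * 60  # 28,800 seconds per working day
aht_target_seconds = 50          # average handling time target per item

daily_quota = logged_in_seconds / aht_target_seconds
print(f"Implied quota: {daily_quota:.0f} items per day")  # 576, i.e. nearly 580
```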

This evidence appears to contradict public statements that Facebook has made in the past about expectations it places on its contractors. “A common misconception about content reviewers is that they’re driven by quotas and pressured to make hasty decisions,” Ellen Silver, Facebook’s vice president of operations, said in a 2018 blog post. “Let me be clear: content reviewers aren’t required to evaluate any set number of posts … We encourage reviewers to take the time they need.”

A Meta spokesperson, Ben Walters, said Meta asks contractors like Sama to encourage moderators to take as much time as they need to make decisions. The video guidelines, he said, were designed to allow content moderators to use their best judgment so as to avoid wasting time on long videos that do not appear to contain policy violations. Sama did not respond to questions about its workflows or targets.

Idris says that while Sama management, not Facebook, are the ones pressuring content moderators over their metrics, he thinks it is clear that they do so because of anxieties about what it would mean if the company did not meet Facebook’s expectations. “That is always their excuse: ‘The client has seen your AHT is high,’” Idris says, referring to times he is put under pressure by his manager. A former employee adds: “They only care about pleasing the client.”

Content moderators at Sama are meant to receive “wellness breaks” of at least an hour per week, to help them deal with seeing traumatizing content. But some employees described having to “beg” to be allowed to take their allotted wellness breaks. A former counselor said that Sama managers, not counselors, had the final say over when and whether content moderators were allowed to take breaks. The counselor witnessed managers repeatedly rejecting content moderators’ requests for breaks, citing productivity pressures. “There is a clinical responsibility in our job to ensure that the moderators are cared for,” said the former counselor, who asked not to be named. “This responsibility is not fully being fulfilled. Sama is more interested in productivity than the safety of the moderator.”

Enter the virus

When Kenya went into COVID lockdown in March 2020, Sama announced that it would rent two luxury lodges in the countryside where employees could relocate so that their work could continue uninterrupted. Many took up the offer, which was optional, and left their families behind. Dozens of Sama’s Facebook content moderators moved into the Lukenya Getaway resort, a short drive from Nairobi, where rooms typically cost more per night than an average Kenyan content moderator at Sama makes in a week.

The arrangement lasted until the end of that summer, along with the company’s generosity. Workers were then expected to work from home, but Sama only contributed $30 toward the cost of setting each employee up with a desk and chair in their homes. The company circulated a Google Form for employees who wanted alternative furniture, and offered to deduct the remaining cost (between $22 and $57) from employees’ salaries over the next three months.

Less than two miles from Sama's office is Mukuru kwa Njenga, one of the largest informal settlements in Nairobi. Khadija Farah for TIME

In November 2021, Sama finally reopened the doors to its office near the Nairobi slum, Mukuru kwa Njenga. Employees, many of whom had not received a single COVID-19 vaccine, say they were pressured to return to the building even as cases of the Omicron variant rose in the city. Some attributed the decision to a belief among managers that content moderators’ quality scores and Average Handling Times would improve if they were physically present. (Several employees said the return to the office had had the opposite effect.)

Soon there was an outbreak of COVID in the office, and many content moderators started requesting sick leave. Rather than granting all of the requests, a Sama HR manager informed employees via text message that sick leave would not be granted without a note from one of two specific hospitals. As a result, some employees who couldn’t access those hospitals went to the office even if they had symptoms, two employees said.

In its statement, Sama said it had offered unlimited COVID-related sick days in response to the pandemic. “As we reopened our offices, we’ve worked to keep our team safe by following CDC guidelines, having nurses on site, utilizing 20% of the office space to encourage social distancing, and we’ve hosted three vaccination drives to make it as easy as possible to get vaccinated if employees choose,” the company said.

Outsourcing trauma to the developing world

In an era when Facebook has come under sustained fire for failing to stem the flow of misinformation, hate speech and incitement to violence on its platforms, the company is often praised when it says it is increasing its spending on safety.

But hiring content moderators in the U.S. and Europe is expensive compared to the cheap labor available in Kenya and other countries in the Global South like India and the Philippines. The rise of content moderation centers in these countries has led some observers to raise concerns that Facebook is profiting from exporting trauma along old colonial axes of power, away from the U.S. and Europe and toward the developing world.

“Outsourcing is a scam that lets Facebook rake in billions while pretending worker exploitation and union-busting is somebody else’s fault,” says Crider, the Foxglove lawyer who is currently preparing a legal case against Sama. “But it’s not just Sama,” she added. “Foxglove has been working with Facebook moderators around the world for years – and these people have had it with exploitation, the strain of toxic content, and suppression of their right to unionize.”

Almost all of the employees TIME spoke to for this story described being profoundly emotionally affected by the content they were exposed to at Sama – trauma that they said was often exacerbated by the way they have been treated in their jobs. Many expressed the opinion that they might be able to handle the trauma of the job – even take pride that they were sacrificing their own mental health to keep other people safe on social media – if only Sama and Facebook would treat them with respect, and pay them a salary that factors in their lasting trauma.

In its statement to TIME, Sama said it had “revisited” its mental health processes after employees raised concerns in 2019 “and made further enhancements, and provided additional coaching to team leads.” But employees say the protections remain inadequate to this day. “When it comes to your personal welfare,” one employee says, “you are not treated like a real human.”

After being fired and returning to South Africa, Motaung, the leader of the failed 2019 strike, says it felt like everything around him crashed. He went to Pretoria to look for work, but struggled. He lost a lot of weight. “I was not OK mentally, emotionally,” he says. He eventually returned to a village in the mountains where he has family. “When I got home, they were like, what happened to you? What were you doing in Kenya? I couldn’t even talk about it because I signed an NDA.”

Sama extended Motaung access to its wellness counselors for one extra month after his departure, as it does for all outgoing employees. But Motaung didn’t take up the offer. He had attended wellness sessions around once a week when he was in Kenya, but found them unhelpful. “Those people … It was sort of like we were there for their entertainment, or there for them simply to get paid,” he says. “Whereas the help that we really needed, we were not getting.”

Motaung says he is still dealing with the trauma he incurred at Sama, but is unable to afford a therapist. “If you do this work, it’s very hard not to experience permanent scars to your emotions and mental state,” he says.

In conversation, Motaung still avoids any specific mention of what he saw during his work, conscious that he is still bound by the NDA. What he will say is that he had a traumatic experience, and that he still gets flashbacks. He expects to carry the burden of that trauma with him until the day he dies. “That sort of thing can change who you are,” he says. “It can destroy the fiber of your entire being.”

—With reporting by Mengistu Assefa Dadi/Addis Ababa and Eloise Barry/London

Do you have additional knowledge of any of the matters discussed in this story? TIME would like to hear from you. Please contact the author: billy.perrigo@time.com

Correction, Feb. 15, 2022:

The original version of this story stated that many Sama employees returned to the office without having been able to obtain a COVID-19 vaccine. Some were unvaccinated when they returned to the office, but Sama had made vaccines available to them on an optional basis. The original version of this story also stated that employees were informed in an email that they would be expected to pay between $22 and $57 to cover the remaining cost of work-from-home equipment, on top of Sama’s $30 contribution. The higher costs were laid out in a Google Form, not an email, and they applied to employees who wanted furniture other than the basic $30 option offered by Sama.

Update, Feb. 15, 2022:

In an email sent after the publication of this story, a Meta spokesperson disputed the characterization of Facebook exporting trauma to developing countries, and provided the additional statement: “We have over 20 sites around the world including Germany, Spain, Ireland, and the United States. Having sites around the world allows us to respond to reports 24/7 and helps ensure we have content reviewers who are native language speakers and who know and understand local culture, as we recognise it often takes a local to understand the specific meaning of a word or the political climate in which a post is shared.”

In the same email, the spokesperson asked TIME to clarify that he said Facebook’s video policy was designed to allow content moderators to avoid wasting time specifically on long videos that did not appear to violate any of Facebook’s policies, not long videos in general.

Write to Billy Perrigo at billy.perrigo@time.com