Content warning: This story contains descriptions of extreme and disturbing violence, suicide, child abuse, and cruelty to animals.
Luis, a 28-year-old student from Colombia, works through the night moderating videos for TikTok. During the day, he tries to get some sleep, but sometimes the videos haunt his dreams.
He remembers one video, taken at a party, in which two people held what initially looked to him like pieces of meat. When they turned around, they appeared to be holding skin and gristle that had been flayed from human faces. “The worst thing was that the friends were playing games and started using the human faces as masks,” he says.
Luis reeled off a list of the kind of content he sees on a regular basis: “Murder, suicide, pedophilic, pornographic content, accidents, cannibalism.”
For Carlos, a former TikTok moderator, it was a video of child sexual abuse that gave him nightmares. The video showed a girl of five or six, he says. “She was dancing, like pointing her back to the camera, it was so close.”
It hit him particularly hard, he says, because he’s a father himself. He hit pause, went outside for a cigarette, then returned to the queue of videos a few minutes later.
Horrific videos such as these are part and parcel of everyday work for TikTok moderators in Colombia. They told the Bureau of Investigative Journalism about widespread occupational trauma and inadequate psychological support, demanding or impossible performance targets, punitive salary deductions and extensive surveillance. Their attempts to unionize to secure better conditions have been opposed repeatedly.
TikTok’s rapid growth in Latin America—it has an estimated 100 million users in the region—has led to the hiring of hundreds of moderators in Colombia to fight a never-ending battle against disturbing content. They work six days a week on day and night shifts, with some paid as little as 1.2 million pesos ($254) a month, compared to around $2,900 for content moderators based in the U.S.
The workers interviewed by the Bureau worked on TikTok content, but were contracted through Teleperformance, a multinational services outsourcing company that has more than 42,000 workers in Colombia, making it one of the country’s largest private employers. The nine moderators could only speak anonymously for fear they might lose their jobs, or undermine their future employment prospects.
Neither TikTok nor Teleperformance responded to detailed lists of allegations for this story. Both companies issued statements saying they are committed to the wellbeing of their employees.
Human labor: cheaper than AI
TikTok’s recommendation algorithm is widely considered to be one of the most effective applications of artificial intelligence (AI) in the world. With almost alarming accuracy, it learns what an individual user finds funny or appealing, and serves them more content that they are likely to enjoy.
But TikTok’s AI expertise only goes so far. The company uses human workers alongside AI to help keep its platform scrubbed of harmful content. And when content moderators at TikTok and other platforms mark a piece of content for removal, they are not just taking it down. They are also collecting data about the specific policies it violates—data that can be used to train the platform’s machine learning systems to better identify such content in the future.
Some social media platforms struggle with even relatively simple tasks, such as detecting copies of terrorist videos that have already been removed. But their task becomes even harder when they are asked to quickly remove content that nobody has seen before. “The human brain is the most effective tool to identify toxic material,” says Roi Carthy, the chief marketing officer of L1ght, a content moderation AI company. Humans become especially useful when harmful content is delivered in new formats and contexts that AI may not identify.
“There’s nobody that knows how to solve content moderation holistically, period,” Carthy says. “There’s no such thing.”
The existence of a low-paid, insecure global workforce may be exacerbating the problem, Carthy says. Videos are more complex than photos and text, so analyzing them requires more computing power, which makes building AI to moderate video especially expensive.
“If you’re looking at this from a monetary perspective, then content moderation AI can’t compete with $1.80 an hour,” Carthy says, referring to a typical wage for content moderators based in the global south. “If that’s the only dimension you’re looking at, then no content moderation AI company can compete with that.”
Psychological support ‘just for show’
Claudia, a current TikTok moderator, told the Bureau she felt anxious and panicked at work after watching successive videos of people eating live animals. These trending videos were impossible to escape and triggered a phobia of hers. “I would just cover my screen and wait for 10 seconds to pass,” she says.
Claudia requested support through the Teleperformance scheme, which had to be approved by a supervisor, but she did not receive any help for two months. When the company’s mental health support staff finally did get in touch, they said they were unable to help her and told her to seek out support through the Colombian healthcare system.
The mental harms of content moderation work are well-documented, and the moderators the Bureau spoke to reported many symptoms caused or exacerbated by their work, including depression, anxiety, loneliness, tremors and sleep loss.
A slick PR video produced by Teleperformance promotes the psychological support it offers. “Some call them angels,” the voiceover says, describing its moderators while vaguely melancholic classical music plays in the background. “We know them as our friends, members of our family.”
However, several moderators say the company’s mental health support scheme is woefully inadequate.
Just one of those interviewed felt Teleperformance did care about the mental health of its employees. Daniela, a former moderator, said the company was particularly concerned about those who, like her, worked on the R1 team dealing with the most extreme content, although she personally had never felt the need to use the support.
But Luis, the man haunted by the video of flayed faces, says the in-house offering was “just for show.” He also works on R1, but ended up seeking psychological support outside of work, through the Colombian healthcare system.
Missing the metric
The TikTok moderation system described by these moderators is built on exacting performance targets. If workers do not get through a huge number of videos, or return late from a break, they can lose out on a monthly bonus worth up to a quarter of their salary.
It is easy to lose out on the much-needed extra cash. Álvaro, a current TikTok moderator, has a target of 900 videos per day, with about 15 seconds to view each video. He works from 6am to 3pm, with two hours of break time, and his base salary is 1.2 million pesos ($254) a month, only slightly higher than Colombia’s minimum wage. All the moderators interviewed by the Bureau were based in the capital, Bogotá, where rent and living expenses are above the national average, while they were working for TikTok.
For Álvaro, hitting his productivity, timekeeping, and accuracy targets can be worth an extra 300,000 pesos ($64). But he says he usually earns closer to his base salary of 1.2 million pesos.
A single slip-up can be enough. He once received a disciplinary notice known internally as an “action form” for only managing to watch 700 videos in a shift, which was considered “work avoidance.” Once a worker has an action form, he says, they cannot receive a bonus that month.
Álvaro worries these disciplinary actions will affect any future reference and hamper his job prospects. “If I put this work in my work experience, they’re gonna say I was avoiding work, not getting the metric, being a bad worker.”
He adds: “You have to just work like a computer. You pick the policies, no more. Don’t say anything, don’t go to bed, don’t go to the restroom, don’t make a coffee, nothing.”
Other moderators who spoke to the Bureau also had daily targets of 900-1,000 videos, while another who worked on longer videos of up to a minute reported targets of 200-250 per shift.
While some say they usually received their bonus, others say they only received it some of the time, and one moderator described it as impossible to attain. Two say they felt they had unfairly missed out despite meeting their targets or being a top performer in their department.
Teleperformance spokesperson Mark Pfeiffer says: “Teleperformance is committed to employee wellbeing and staff diversity, equity and inclusion. People care has been, is and will continue to be a global top priority for our business.”
Home surveillance
Teleperformance, based in Paris, has become a market leader in content moderation services, with more than 7,000 such workers around the world, according to analysis by Market Research Future. Moderators told the Bureau that in Colombia the company also works with Meta, Discord, and Microsoft as well as TikTok, and it sees the country as a key hub for cementing this dominance. Teleperformance documents name Colombia as one of two major hubs for content moderation in Latin America, the other being Brazil.
During the pandemic, business was booming. Last year, Teleperformance reported record revenue of €7.1 billion ($8.1 billion) and €557 million ($620 million) in profit.
The widespread shift to working from home was partly behind this huge rise in profits, as companies contracted Teleperformance to help them manage their newly remote workforce, according to Aarti Dhapte, a senior analyst with Market Research Future.
At the same time, a large proportion of Teleperformance’s own staff were shifted to remote work, cutting the cost of office maintenance. Teleperformance says that 70% of its global workforce is now remote, as were most of the moderators interviewed by the Bureau.
To reassure clients that this shift would not lead to a drop in standards, Teleperformance rolled out extensive surveillance systems, using both proprietary and third-party software to monitor its employees. NBC News revealed last year that Teleperformance workers in Colombia, including those subcontracted to Apple and Uber, had been pressured to sign contracts giving the company the right to install cameras in their homes.
Carolina, a former TikTok moderator who worked remotely for Teleperformance between June and September 2020, says supervisors asked her to be on camera continuously during her night shift. She was also warned that nobody else should be in view of the camera, and that her desk should be empty, apart from a drink in a transparent cup.
“That was terrible, because my family lives in my house as well,” Carolina says. “So I felt very guilty telling them, ‘Please don’t pass behind the camera because I could be fired’. Teleperformance is especially paranoid with people seeing what we do.”
Current moderators did not say they had to work on camera, but they do have to clock in and out and log any breaks on an app called Timekeeper. When Álvaro once returned a few minutes late from a break on his overnight shift, his supervisor contacted him and put him on an action form.
A TikTok spokesperson says: “We strive to promote a caring working environment for our employees and contractors. Our Trust and Safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”
Union busting
Outsourcing moderation to countries in the global south like Colombia works for businesses because it is cheap, and workers are poorly protected. Carolina, who left Teleperformance two years ago, says: “It’s very important for people to get to know that this is abusive, and they’re just companies taking advantage of the terrible economical situation that we’ve got for young people in Latin America.”
Utraclaro, a Colombian union which represents IT and call center workers, has been trying to organize Teleperformance staff – including TikTok moderators – for more than two years. It has been a slow and laborious process, with pushback from the company at every stage.
The union’s key demands are that the company allow workers to form a union without intimidation, and that union reps be given access to the workplace to talk to their colleagues.
Teleperformance did not comment on specific allegations of union-busting in Colombia.
Eyewitnesses who asked to remain anonymous said they had seen union organizers harassed several times by security staff while trying to speak to Teleperformance workers at a business park in Bogotá.
When organizers gave out flyers or attempted to speak to workers taking their breaks outside the Teleperformance offices, the park’s private security guards followed them and told them to stop. One person said she had seen a Teleperformance worker watch the organizers and then speak with security guards; after that, more guards began watching the workers.
On another occasion, the guards called the police after arguing with the organizers about their right to be there.
“Even if you’re in a public area they don’t care, they will harass you, they’ll call the police,” said another source. “And because it’s a big company, the police also will be in their favor.”
After Utraclaro officially notified Teleperformance of its demands in August 2021, Teleperformance filed a legal claim at a labor court in Bogotá, alleging Utraclaro did not follow the proper processes when ratifying its bargaining demands.
The lawsuit is an attempt to intimidate workers, said UNI Global Union, an international trade union federation that is supporting Utraclaro in the dispute.
Teleperformance’s lawyers have sought the “suspension, dissolution, liquidation and cancellation of the union registration,” public court records show.
Change, however, may be on the horizon. There has been some progress in discussions between Utraclaro and Teleperformance in recent weeks. “The union in Colombia still has a frivolous lawsuit hanging over its head but at least there are some ongoing discussions which could potentially move things forward,” says Christy Hoffman, general secretary of UNI Global Union.
If Teleperformance reaches a deal with the union, it would be a significant moment for the outsourcing sector, which has historically been hostile to labor organizing.
“I’ve been in this kind of business since 2013,” says one Teleperformance customer service employee. “I can tell you that if you say that word [sindicato, Spanish for trade union], you will be out the next day. Sindicato? You can’t say that.”
Claudia says she joined the union to win a better salary and improved mental health support. Luis also hoped staff would get a pay rise, and that the intense pressure on moderators would be reduced.
Álvaro, who lives alone in Bogotá and worked six days a week for the whole of last December after a holiday request was denied, says he just wants to spend Christmas with his family this year.
For now, however, TikTok’s low-paid moderators will keep working to their grueling targets, sifting through some of the internet’s most nightmarish content.
This story is the result of a partnership between TIME and The Bureau of Investigative Journalism, a non-profit newsroom based in London. The names of moderators who spoke with the Bureau have been changed to protect their identity.