Kauna Malgwi had just started at university in northern Nigeria when the terrorist group Boko Haram began a major insurgency. The Islamist militants, whose name means “western education is forbidden,” raided universities like Malgwi’s, kidnapped female students, and planted roadside bombs. Once, Malgwi says, one of those bombs exploded just five minutes after her vehicle had passed it. The militants especially targeted Christians like her. In 2012, she fled with her mother and aunt south to Abuja, the nation’s capital.
Malgwi completed her education in Kenya, and in 2019 signed up for what she says she thought was a call center job with the contractor Sama. In reality, she discovered, the job was as an outsourced content moderator for Facebook, a role that required viewing videos of wartime atrocities, rapes, and suicides in order to remove them from the platform. (Sama disputes her allegation. “All team members understood the nature of the work before starting,” a spokesperson said.) The labor helped Facebook’s parent company Meta train its AI systems to detect similar content in the future. But in 2023, after a colleague blew the whistle on low pay and alleged union-busting at Sama, the contractor pulled the plug, making Malgwi and around 260 others redundant. Although a Kenyan court later ruled that Meta was the moderators’ primary employer, the tech giant is appealing the decision and has yet to pay the redundancy packages the moderators believe they are owed. Meta did not respond to a request for comment; Sama says its decision to end its contract with Meta was a business decision unrelated to the whistleblower allegations.
As the case drags on, Malgwi has emerged as one of the leading voices on the invisible data work that underpins so much of today’s advanced AI. She leads the Nigeria chapter of the Content Moderators Union, where she sees her responsibility as ensuring young people know their rights when they sign up to work for tech companies. That’s vital, she says, as Meta and others are now seeking contractors in jurisdictions other than Kenya for content moderation, in the wake of the legal challenges brought by Malgwi and her colleagues there. “The fear that you can just be laid off because you are raising concerns is beginning to diminish, because people are beginning to be aware that they have rights as workers,” Malgwi says of the work her team has already done in Nigeria. Her ambition is to spark similar efforts across the continent. “Imagine if from all the African countries there is such pressure—Big Tech will have no option but to do the right thing.”
Malgwi has even helped influence laws outside Africa. In February, she testified at the European Parliament in Brussels, telling lawmakers of the insomnia she still suffers as a result of her work; the paranoia she feels seeing men with small children after viewing videos of child abuse; and her pride at having likely saved lives by escalating terrorist content before attacks could be carried out. After her testimony, she wiped away tears. Two months later, the Parliament voted to approve the Platform Work Directive, a law that will regularize the employment status and rights of platform workers such as Uber drivers and content moderators in E.U. member states. Although Malgwi won’t directly benefit, she sees it as a step in the right direction that could have global ripple effects. “It will have a positive impact,” she says. “If such rules are to be made, I’m sure even Facebook knows one day things will also change for them.”
*Disclosure: Investors in Sama include Salesforce, where TIME co-chair and owner Marc Benioff is CEO.
Write to Billy Perrigo at billy.perrigo@time.com