A Kenyan judge on Monday rejected Meta’s attempt to have its name struck from a lawsuit alleging widespread failings in its safety operations in Africa, in a significant blow to Facebook’s parent company.
The ruling means Meta will be “significantly subjected to a court of law in the global south” for the first time in its history, according to Amnesty International.
The lawsuit alleges that Meta and its local outsourcing partner Sama are responsible for multiple violations of the Kenyan constitution, including union-busting and worker exploitation. Those alleged abuses undermined the platform’s efforts to keep millions of people across Africa safe, lawyers argue.
The lawsuit was filed by Daniel Motaung, a former Facebook content moderator for Sama in Kenya, who says he and his colleagues suffered severe post-traumatic stress disorder after being exposed on the job to graphically violent images and videos. Motaung alleges he was unlawfully dismissed after leading an effort to unionize his colleagues for better pay and working conditions—in events that were first reported by TIME last year.
Read More: Inside Facebook’s African Sweatshop
Meta’s lawyers had argued that because Meta does not formally operate in Kenya, the country’s legal system lacked the proper jurisdiction to hear the case.
But a judge in Nairobi’s employment and labor relations court ruled on Monday that Meta is a “proper party” to the case. Nairobi is the hub for Meta’s outsourced content moderation workforce in sub-Saharan Africa, which handles over a dozen languages spoken throughout the region.
Meta declined to comment.
The aim of the case is to “force Facebook to reform its exploitative moderation system, which has hurt Kenyan workers and undermined the safety of millions of Kenyan and other Africans,” Foxglove, a London-based legal NGO that is supporting Motaung, said in a statement.
“We are extremely pleased Facebook have been found to be proper parties to this case and we look forward to the day when Facebook will face justice for exploiting content moderators like Daniel,” said Cori Crider, co-director of Foxglove, in a statement. “We think it’s right that this trial be heard in Kenya, where the abuses happened. Mark Zuckerberg advertises his service to African users and profits from Kenyan advertising—but he refuses to invest enough resources to keep Kenyans safe and treat the key workers who protect Facebook with the dignity and humanity they deserve.”
The next hearing in the case is scheduled for March 8.
The ruling that Meta is subject to Kenyan law could also have implications for another major case against Meta in the Kenyan legal system. That case accuses the company of failing to prevent hate speech and incitement to violence in Ethiopia, where ethnic tensions fueled a brutal civil war in the Tigray region from November 2020 to November 2022.
The case, which like Motaung’s is supported by Foxglove, argues that Facebook failed to hire enough content moderators with expertise in Ethiopian languages at its Nairobi hub; that its algorithm amplifies hateful content; and that the company failed to invest appropriately in safety for its African users.
Sama, Facebook’s outsourcing partner in Kenya, announced in January that it is exiting the content moderation business, and that around 200 of its Facebook content moderators would lose their jobs. It had previously denied allegations of workers’ rights abuses and inadequate mental health support.
Write to Billy Perrigo at billy.perrigo@time.com