For years, Facebook has faced allegations that it has failed to prevent the spread of harmful content in Ethiopia, a country wracked by ethnic violence in a divisive civil war. Now, Facebook’s parent company Meta has been hit with a lawsuit, alleging that the platform’s amplification of—and failure to remove—hateful posts contributed to the deaths of ethnic minorities in Ethiopia.
One of the lawsuit’s two plaintiffs is Abrham Meareg, a Tigrayan man whose father was killed in an attack that he says was a direct result of ethnically motivated misinformation shared on the platform.
Meareg’s father, Meareg Amare, was a respected chemistry professor at Bahir Dar University in the Amhara region of Ethiopia, according to his son’s witness statement accompanying the lawsuit, filed in Nairobi, Kenya. As a Tigrayan, Amare was an ethnic minority in the region. In the fall of 2021, as conflict escalated between Amharas and Tigrayans in the Ethiopian civil war, several accounts on Facebook shared Amare’s name and photograph, and posted comments accusing him of being a “snake” and posing a threat to ethnic Amharas. Although his son saw and reported many of the posts to the platform, Facebook declined to remove them, the witness statement alleges.
On Nov. 3, 2021, a group of men followed Amare home from the university and shot him dead outside his home, the lawsuit says. He lay dying in the street for seven hours, the lawsuit adds, with the men warning onlookers that they too would be shot if they gave him medical assistance.
“I hold Facebook responsible for my father’s killing,” Abrham Meareg told TIME. “Facebook causes hate and violence to spread in Ethiopia with zero consequences.”
The other plaintiff in the case is former Amnesty International researcher Fisseha Tekle, who gathered evidence of Facebook posts that the lawsuit says contributed to real-world killings. His work led to him and his family becoming targets of abuse, the lawsuit says.
Ethiopia has long been a key example cited by critics of Facebook’s role in ethnic violence internationally, along with Myanmar, where Facebook has admitted it did not do enough to prevent what some observers have labeled a genocide. In 2021, documents leaked by former Facebook employee Frances Haugen revealed that staff at the platform knew it was not doing enough to prevent armed groups in Ethiopia from using the platform to spread ethnic hatred. “Current mitigation strategies are not enough,” one of the internal documents said. But the new lawsuit is the first to directly present allegations of Facebook posts leading to deaths there.
The lawsuit demands Meta impose measures to further reduce the spread of hatred and incitement to violence in Ethiopia. The company took similar “break glass” steps during the January 6 riot at the U.S. Capitol in 2021, when the platform “down-ranked” content that its algorithms determined posed a risk of incitement to violence. The plaintiffs are petitioning the court to force Meta to create a $1.6 billion fund for “victims of hate and violence incited on Facebook.” The lawsuit also proposes that Facebook hire more content moderators with Ethiopian language expertise at its Africa hub in Nairobi, where TIME exposed low pay and alleged workers’ rights violations in an investigation earlier this year.
Lawyers for Tekle and Meareg said they filed the lawsuit in a court in Kenya rather than in Ethiopia, because Nairobi is the base for Facebook’s content moderation operation in Sub-Saharan Africa. “Nairobi has become a Hub for Big Tech,” Mercy Mutemi, the lawyer representing the plaintiffs, said in a statement. “Not investing adequately in the African market has already caused Africans to die from unsafe systems. We know that a better Facebook is possible—because we have seen how preferentially they treat other markets. African Facebook users deserve better. More importantly, Africans deserve to be protected from the havoc caused by underinvesting in protection of human rights.”
Haugen’s disclosures “show Facebook knows that this is a really serious problem, that their software design is promoting viral hate and violent inciting posts,” says Rosa Curling, co-director at the legal nonprofit Foxglove, which is supporting the case. “They are not doing anything to change that, and on the face of it it looks as if that’s being done simply for the benefit of their profits.”
In a statement, a Meta spokesperson said: “We have strict rules that outline what is and isn’t allowed on Facebook and Instagram. Hate speech and incitement to violence are against these rules, and we invest heavily in teams and technology to help us find and remove this content. Feedback from local civil society organizations and international institutions guides our safety and integrity work in Ethiopia. We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya.”
Facebook has 21 days to respond to the lawsuit in the Nairobi court.
Write to Billy Perrigo at billy.perrigo@time.com