
Facebook’s Ties to India’s Ruling Party Complicate Its Fight Against Hate Speech


In July 2019, Alaphia Zoyab was on a video call with Facebook employees in India, discussing some 180 posts by users in the country that Avaaz, the watchdog group where she worked, said violated Facebook’s hate speech rules. But halfway through the hour-long meeting, Shivnath Thukral, the most senior Facebook official on the call, got up and walked out of the room, saying he had other important things to do, Zoyab says.

Among the posts was one by Shiladitya Dev, a lawmaker in the state of Assam for Prime Minister Narendra Modi’s Hindu nationalist Bharatiya Janata Party (BJP). He had shared a news report about a girl being allegedly drugged and raped by a Muslim man, and added his own comment: “This is how Bangladeshi Muslims target our [native people] in 2019.” But rather than removing it, Facebook allowed the post to remain online for more than a year after the meeting, until TIME contacted Facebook to ask about it on Aug. 21. “We looked into this when Avaaz first flagged it to us, and our records show that we assessed it as a hate speech violation,” Facebook said in a statement to TIME. “We failed to remove upon initial review, which was a mistake on our part.”

Thukral was Facebook’s public policy director for India and South Asia at the time. Part of his job was lobbying the Indian government, but he was also involved in discussions about how to act when posts by politicians were flagged as hate speech by moderators, former employees tell TIME. Facebook acknowledges that Thukral left the meeting, but says he never intended to stay for its entirety, and joined only to introduce Zoyab, whom he knew from a past job, to his team. “Shivnath did not leave because the issues were not important,” Facebook said in the statement, noting that the company took action on 70 of the 180 posts presented during the meeting.

Shivnath Thukral at the Moving to Better Ground session during the India Economic Summit in Mumbai, November 2011. Eric Miller—World Economic Forum

The social media giant is under increasing scrutiny for how it enforces its hate speech policies when the accused are members of Modi’s ruling party. Activists say some Facebook policy officials are too close to the BJP, and accuse the company of putting its relationship with the government ahead of its stated mission of removing hate speech from its platform—especially when ruling-party politicians are involved. Thukral, for instance, worked with party leadership to assist in the BJP’s 2014 election campaign, according to documents TIME has seen.

Facebook’s managing director for India, Ajit Mohan, denied suggestions that the company had displayed bias toward the BJP in an Aug. 21 blog post titled, “We are open, transparent and non-partisan.” He wrote: “Despite hailing from diverse political affiliations and backgrounds, [our employees] perform their respective duties and interpret our policies in a fair and non-partisan way. The decisions around content escalations are not made unilaterally by just one person; rather, they are inclusive of views from different teams and disciplines within the company.”

Facebook published the blog post after the Wall Street Journal, citing current and former Facebook employees, reported on Aug. 14 that the company’s top policy official in India, Ankhi Das, pushed back against other Facebook employees who wanted to label a BJP politician a “dangerous individual” and ban him from the platform after he called for Muslim immigrants to be shot. Das argued that punishing the state lawmaker, T. Raja Singh, would hurt Facebook’s business prospects in India, the Journal reported. (Facebook said Das’s intervention was not the sole reason Singh was not banned, and that it was still deciding if a ban was necessary.)


Those business prospects are sizeable. India is Facebook’s largest market, with 328 million people using the social media platform. Some 400 million Indians also use Facebook’s messaging service WhatsApp — a substantial chunk of the country’s estimated 503 million internet users. The platforms have become increasingly important in Indian politics; after the 2014 elections, Das published an op-ed arguing that Modi had won because of the way he leveraged Facebook in his campaign.

But Facebook and WhatsApp have also been used to spread hate speech and misinformation that have been blamed for helping to incite deadly attacks on minority groups amid rising communal tensions across India—despite the company’s efforts to crack down. In February, a video of a speech by BJP politician Kapil Mishra was uploaded to Facebook, in which he told police that unless they removed mostly Muslim protesters occupying a road in Delhi, his supporters would do it themselves. Violent riots erupted within hours. (In that case, Facebook determined the video violated its rules on incitement to violence and removed it.)

WhatsApp, too, has been used with deadly intent in India — for example by cow vigilantes, Hindu mobs that have attacked Muslims and Dalits accused of killing cows, an animal sacred in Hinduism. At least 44 people, most of them Muslims, were killed by cow vigilantes between May 2015 and December 2018, according to Human Rights Watch. Many cow vigilante murders happen after rumors spread on WhatsApp, and videos of lynchings and beatings are often shared via the app too.


TIME has learned that Facebook, in an effort to evaluate its role in spreading hate speech and incitements to violence, has commissioned an independent report on its impact on human rights in India. Work on the India audit, previously unreported, began before the Journal published its story. It is being conducted by the U.S. law firm Foley Hoag and will include interviews with senior Facebook staff and members of civil society in India, according to three people with knowledge of the matter and an email seen by TIME. (A similar report on Myanmar, released in 2018, detailed Facebook’s failings on hate speech that contributed to the Rohingya genocide there the previous year.) Facebook declined to confirm the report.

But activists, who have spent years monitoring and reporting hate speech by Hindu nationalists, tell TIME that they believe Facebook has been reluctant to police posts by members and supporters of the BJP because it doesn’t want to pick fights with the government that controls its largest market. The way the company is structured exacerbates the problem, analysts and former employees say, because the same people responsible for managing the relationship with the government also contribute to decisions on whether politicians should be punished for hate speech.

“A core problem at Facebook is that one policy org is responsible for both the rules of the platform and keeping governments happy,” Alex Stamos, Facebook’s former chief security officer, tweeted in May. “Local policy heads are generally pulled from the ruling political party and are rarely drawn from disadvantaged ethnic groups, religious creeds or castes. This naturally bends decision-making towards the powerful.”

Some activists have grown so frustrated with the Facebook India policy team that they’ve begun to bypass it entirely in reporting hate speech. After the call from which Thukral walked out, Avaaz decided to begin reporting hate speech directly to Facebook’s headquarters in Menlo Park, Calif. “We found Facebook India’s attitude utterly flippant, callous, uninterested,” says Zoyab, who has since left Avaaz. Another group that regularly reports hate speech against minorities on Facebook in India, which asked not to be named out of fear for the safety of its staffers, said it has been doing the same since 2018. In a statement, Facebook acknowledged some groups that regularly flag hate speech in India are in contact with Facebook headquarters, but said that did not change the criteria by which posts were judged to be against its rules.


The revelations in the Journal set off a political scandal in India, with opposition politicians calling for Facebook to be officially investigated for alleged favoritism toward Modi’s party. And the news caused strife within the company too: In an internal open letter, Facebook employees called on executives to denounce “anti-Muslim bigotry” and do more to ensure hate speech rules are applied consistently across the platform, Reuters reported. The letter alleges that there are no Muslim employees on the India policy team; in response to questions from TIME, Facebook said it was legally prohibited from collecting such data.

Facebook friends in high places

While it is common for companies to hire lobbyists with connections to political parties, activists say the history of staff on Facebook’s India policy team, as well as their incentive to keep the government happy, creates a conflict of interest when it comes to policing hate speech by politicians. Before joining Facebook, Thukral had worked on behalf of the BJP. Despite this, he was involved in making decisions about how to deal with politicians’ posts that moderators flagged as violations of hate speech rules during the 2019 elections, former employees tell TIME. His Facebook likes include a page called “I Support Narendra Modi.”

Former Facebook employees tell TIME they believe a key reason Thukral was hired in 2017 was that he was seen as close to the ruling party. In 2013, during the BJP’s eventually successful campaign to win national power in the 2014 elections, Thukral worked with senior party officials to help run a pro-BJP website and Facebook page. The site, called Mera Bharosa (“My Trust” in Hindi), also hosted events, including a project aimed at getting students to sign up to vote, according to interviews with people involved and documents seen by TIME. A student who volunteered for a Mera Bharosa project told TIME he had no idea it was an operation run in coordination with the BJP, and that he believed he was working for a non-partisan voter registration campaign. According to the documents, this was a calculated strategy to hide the true intent of the organization. By early 2014, the site changed its name to “Modi Bharosa” (meaning “Modi Trust”) and began sharing more overtly pro-BJP content. It is not clear whether Thukral was still working with the site at that time.

In a statement to TIME, Facebook acknowledged Thukral had worked on behalf of Mera Bharosa, but denied his past work presented a conflict of interest because multiple people are involved in significant decisions about removing content. “We are aware that some of our employees have supported various campaigns in the past both in India and elsewhere in the world,” Facebook said as part of a statement issued to TIME in response to a detailed series of questions. “Our understanding is that Shivnath’s volunteering at the time focused on the themes of governance within India and are not related to the content questions you have raised.”

Now, Thukral has an even bigger job. In March 2020, he was promoted from his job at Facebook to become WhatsApp’s India public policy director. In the role, New Delhi tech policy experts tell TIME, one of Thukral’s key responsibilities is managing the company’s relationship with the Modi government. It’s a crucial job, because Facebook is trying to turn the messaging app into a digital payments processor — a lucrative idea potentially worth billions of dollars.

In April, Facebook announced it would pay $5.7 billion for a 10% stake in Reliance Jio, India’s biggest telecoms company, which is owned by India’s richest man, Mukesh Ambani. On a call with investors in May, Facebook CEO Mark Zuckerberg spoke enthusiastically about the business opportunity. “With so many people in India engaging through WhatsApp, we just think this is going to be a huge opportunity for us to provide a better commerce experience for people, to help small businesses and the economy there, and to build a really big business ourselves over time,” he said, talking about plans to link WhatsApp Pay with Jio’s vast network of small businesses across India. “That’s why I think it really makes sense for us to invest deeply in India.”


But WhatsApp’s future as a payments application in India depends on final approval from the national payments regulator, which is still pending. Facebook’s hopes for expansion in India have been quashed by a regulator before: in 2016, the country’s telecoms watchdog ruled that Free Basics, Facebook’s plan to provide free Internet access limited to select sites, including its own, violated net neutrality rules. One of Thukral’s priorities in his new role is ensuring that a similar setback doesn’t derail Facebook’s ambitions for WhatsApp Pay.

‘No foreign company in India wants to be in the government’s bad books’

While the regulator is technically independent, analysts say that Facebook’s new relationship with the wealthiest man in India will likely make it much easier to gain approval for WhatsApp Pay. “It would be easier now for Facebook to get that approval, with Ambani on its side,” says Neil Shah, vice president of Counterpoint Research, an industry analysis firm. And goodwill from the government itself is important too, analysts say. “No foreign company in India wants to be in the government’s bad books,” says James Crabtree, author of The Billionaire Raj. “Facebook would very much like to have good relations with the government of India and is likely to think twice about doing things that will antagonize them.”

The Indian government has shown before it is not afraid to squash the dreams of foreign tech firms. In July, after a geopolitical spat with China, it banned dozens of Chinese apps including TikTok and WeChat. “There has been a creeping move toward a kind of digital protectionism in India,” Crabtree says. “So in the back of Facebook’s mind is the fact that the government could easily turn against foreign tech companies in general, and Facebook in particular, especially if they’re seen to be singling out major politicians.”

With hundreds of millions of users already in India, and hundreds of millions more who don’t have smartphones yet but might in the near future, Facebook has an incentive to avoid that possibility. “Facebook has said in the past that it has no business interest in allowing hate speech on its platform,” says Chinmayi Arun, a resident fellow at Yale Law School, who studies the regulation of tech platforms. “It’s evident from what’s going on in India that this is not entirely true.”

Facebook says it is working hard to combat hate speech. “We want to make it clear that we denounce hate in any form,” said Mohan, Facebook’s managing director in India, in his Aug. 21 blog post. “We have removed and will continue to remove content posted by public figures in India when it violates our Community Standards.”

But scrubbing hate speech remains a daunting challenge for Facebook. At an employee meeting in June, Zuckerberg highlighted Mishra’s February speech ahead of the Delhi riots, without naming him, as a clear example of a post that should be removed. The original video of Mishra’s speech was taken down shortly after it was uploaded. But another version of the video, with more than 5,600 views and a long list of supportive comments underneath, remained online for six months until TIME flagged it to Facebook in August.


Write to Billy Perrigo at billy.perrigo@time.com