
Release India Human Rights Impact Assessment report: Activists to Facebook

Sabrang India |
The report could shed light on the exact role played by Facebook in helping cultivate an ecosystem of hate in India.

For nearly two years now, Indian and international rights organisations have waited for Meta Platforms Inc. (formerly Facebook Inc.) to release its Human Rights Impact Assessment (HRIA) report on India. Trailblazers leading the campaign against hate crimes hoped the report would help them understand the extent of hate and communal violence allowed on the company’s social media platforms. However, as months passed with no hint of the company’s willingness to release the report, whistleblowers, activists and experts came together to discuss Meta’s role in today’s hate ecosystem.

On January 19, 2022, the Real Facebook Oversight Board, an emergency response to ongoing harms on Meta’s platforms comprising global scholars, experts and advocates, along with India Civil Watch International and Citizens for Justice and Peace (CJP), called a virtual press briefing. Speakers including CJP Secretary Teesta Setalvad, former Delhi Minorities Commission Chairman Dr Zafarul Khan, former Facebook Vice President Brian Boland, and whistleblowers Sophie Zhang and Frances Haugen discussed the company’s human rights problems in India and allegations that Meta / Facebook is deliberately delaying the HRIA report.

Host Kyle Taylor noted that over 20 international civil society organizations had sent a letter to Meta on January 3 demanding the report’s immediate release, but received no favourable response. “The role of the company in human rights issues and unrest in countries around the world is deeply troubling. India is at the top of that list, where the use of Facebook’s [Meta’s] platforms including WhatsApp have sparked a great amount of violence,” he said.

Setalvad shed further light on this, highlighting India’s recent assessment in various international reports as a “partly free” and a “partially free electoral democracy”, and its falling rank in the Freedom Index. She spoke about the recent hate directed at minority groups, such as the call for communal genocide in December 2021, attacks on Christians during Christmas, and the “auction” of Muslim women via social media apps. “Within this overall scenario, Facebook India as a platform is playing a dangerous role. It has a vast clientele of over 460 million users in English and 22 Indian languages and allows unchecked inflammatory content that has become an instrument for targeting minorities, Dalits and women,” she said.

She further cited numerous interactions between CJP and Facebook India in October 2018, 2019 and March 2021, where the company had a lukewarm response to the various successive complaints by CJP about how it was allowing rampant hate to spread on its platform. During the latest interaction, she recollected that Facebook finally concluded that hate-spreader Raja Singh violated its Community Standards (Objectionable Content) and Violence and Criminal Behaviour rules, and removed him from the site. However, his fan pages and followers continue to generate provocative content.

Similarly, during the 2020 Delhi violence, CJP sent two complaints regarding the hate speech promoted by hate-monger and Rashtriya Azad Manch member Ragini Tiwari. In response, Facebook said it was not in a position to take any action against Tiwari, but suggested that CJP contact the party directly.

This echoed an earlier instance in November 2018, when CJP complained about a post by Yati Narsinghanand that told Hindus to arm themselves 24x7 and called Islam a ‘cancer’, only to be told by Facebook that the post did not violate community standards.

“We have tried to engage, however, and whenever given the chance have had detailed correspondence, attended India Roundtables, and offered more than a dozen and a half minute case studies and many more complaints, all of which have unfortunately yielded unsatisfactory results. All this work has also come at a risk and cost, as the government tracks critics and dissenters,” said Setalvad.

Regarding the HRIA report, Setalvad highlighted the extensive meetings where CJP presented its analyses and case studies, complaints and responses in great detail, and how it was disappointing to see that a mega corporation such as Meta skirted the call for accountability and transparency.

“Facebook needs to engage with the issue directly, honestly and comprehensively. It must release the full, undiluted, and unredacted HRIA on India,” she said.


Regarding future steps after the release of the report, Setalvad said Facebook has to invest more in non-English languages; India alone has 22 official languages, and these are the mediums that carry the bulk of the hate on the platform. She called for the company to bring in experts in these non-English languages, and said Facebook must also engage with victim groups, targeted groups and civil society.

While Setalvad talked about the rampant hate speech and the role of the company, Delhi Minorities Commission ex-Chairman Dr Zafarul Khan spoke about how India’s ruling regime, the Bharatiya Janata Party (BJP), has historically used technology to spread hate.

“BJP believes in using unscrupulous means to retain power. Riots, polarisation along social lines included,” he said.

He cited how, in 2005, the BJP used SMS messages in Assam to evict Bangladeshis from Dibrugarh district; around 10,000 people fled their homes at the time, fearing violence. Once the BJP IT Cell was created, he said, Home Minister Amit Shah had the power to reach 300 million Indians, a quarter of the national population, instantly. This power was especially exploited during the Covid-19 pandemic, when the term “corona jihad” was coined to further oppress Muslims.

“For months, Muslims were maltreated and even lynched. Both Delhi and the Centre maintained daily data on Covid victims with a separate mention of Tablighi Jamaat victims, supporting the anti-Muslim narrative,” said Khan, who intervened with the police in April 2020 and had such data categories stopped. The Centre soon followed suit.

Recent revelations show that the IT cell consists of around 5,000 full-time workers and hundreds of thousands of part-timers and sympathisers. The ruling regime also has an industrial-scale propaganda app, Tek Fog, which sends millions of propaganda messages every day. These messages are replicated in regional languages by state-level IT cells. Khan called Facebook both a beneficiary and an offender in all of this.

“Business will do anything to earn an extra buck. Hate will go to any length to spread evil messages to divide and polarise society on religious lines. FB continues to fail to remove hate speech,” said Khan. He argued that the company, with more than 300 million Facebook users and over 400 million WhatsApp users in India, had a huge stake in the country.

Having compiled a crucial report on the North-East Delhi anti-Muslim riots of 2020, he said that the incident too was a result of hate messaging. At the time, the government had passed the Citizenship Amendment Act (CAA), which, along with the National Population Register (NPR) and the National Register of Citizens (NRC), stood to affect the citizenship of around 30 to 40 million Muslims. This sparked huge protests across India that went on for three months at around 800 locations.

The riot was an attack planned by the IT cell to stop these protests, said Khan. It left around 55 people dead and ruined hundreds of houses and 19 mosques. As DMC Chairman at the time, Khan sent notices to the police, but to no avail. He also sent a letter to the Chief Justice of India detailing 87 hate posts, but even the CJI did not take notice.

“After the violence, police started implicating victims. They implicated the victims as planners and executors of the riot. My report showed how all the activity leading to it was by the BJP IT cell. But still, they did not question Kapil Mishra or Ragini Tiwari. Indian humans have been practically de-humanised,” he said.

Further, former Facebook Vice President Brian Boland voiced his concern that the Facebook algorithm promoted harmful content, a fact that did not interest the company. He spoke about CrowdTangle, a tool that helped journalists with transparency; however, its trend data showed concerning patterns that Facebook disregarded. “It was about controlling the story, not understanding the impact, with FB,” said Boland.

He also pointed out that FB’s investment in ensuring safety and transparency on its platforms was tiny. Where the company talked about investing two and a half million dollars in safety, it had already spent over 13 billion dollars on shareholder payouts and stock buybacks in the same three-month quarter. Over the five years of safety investment, 50 billion dollars went into stock buybacks and shareholder payouts.

“The priority is clear. People who use Facebook are not a priority. They will invest billions in protecting shareholders and much less on protecting people,” he said.

Boland called for pressure to be built on Facebook to release the HRIA report, and suggested that legislation could help increase transparency and accountability in the “business”.

Whistleblower Sophie Zhang underscored this need, having uncovered political manipulation and hate on Meta’s platforms. Earlier, she had testified that Facebook’s negligence was, de facto, letting authoritarian regimes manipulate public discourse.

“FB feels no responsibility to protect democracy and clean up clutter. Politicised decision-making was largest in India. When they caught Facebook accounts manipulating Indian politics, they promised to take action. But when they realized the hate came from a BJP MP, there was radio silence,” said Zhang. She spoke about how the large-scale use of fake accounts is legitimised in India under the excuse of IT cells. She alleged that Facebook knew all along about the BJP’s violation of community standards and refused to act. “The policy department is the same as the one responsible for lobbying governments. This leads to political interference,” she said.

Similarly, Frances Haugen said that the company prioritises its own interests over public safety. Having worked as a lead product manager on Facebook’s civic misinformation team, dealing with issues of democracy and misinformation, she knew FB intentionally kept its operations behind a veil.

“Before my testimony, there was only anecdotal information. We must push for mandatory transparency because Facebook hides unfavourable results through volunteered transparency. It does not show the performance of its hate safety systems,” she said.

Haugen also argued that the company does not invest in India at an appropriate level, even though India is Facebook’s largest user population. She said the company must be required to disclose what its safety systems are for a linguistically diverse society; otherwise, India will not get the safety it deserves.

In all, the speakers agreed that Meta is not cooperating because it knows that the report’s contents are unfavourable. For this reason, they pressed all the more strongly for the release of the HRIA report.

Courtesy: Sabrang India
