
Internal Documents Show Facebook Wavered in Checking Misinformation, Hate Speech in India: Report

PTI
In the document titled 'Adversarial Harmful Networks: India Case Study', Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content”.

New York: Internal documents at Facebook show "a struggle with misinformation, hate speech and celebrations of violence" in India, the company's biggest market, with researchers at the social media giant pointing out that there are groups and pages "replete with inflammatory and misleading anti-Muslim content" on its platform, US media reports have said.

In a report published on Saturday, The New York Times said that in February 2019, a Facebook researcher created a new user account to see what the social media website would look like for a person living in Kerala.

"For the next three weeks, the account operated by a simple rule: Follow all the recommendations generated by Facebook’s algorithms to join groups, watch videos and explore new pages on the site. The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month," the NYT report said.

"Internal documents show a struggle with misinformation, hate speech and celebrations of violence in the country, the company’s biggest market,” said the report based on disclosures obtained by a consortium of news organisations, including the New York Times and the Associated Press.

The documents are part of a larger cache of material collected by whistleblower Frances Haugen, a former Facebook employee who recently testified before the Senate about the company and its social media platforms.

The report said the internal documents include reports on how bots and fake accounts tied to the “country’s ruling party and opposition figures” were wreaking havoc on national elections.

The NYT said that in a separate report produced after the 2019 national elections, Facebook found that “over 40 per cent of top views, or impressions, in the Indian state of West Bengal were fake/inauthentic”. One inauthentic account had amassed more than 30 million impressions.

In an internal document titled 'Adversarial Harmful Networks: India Case Study', Facebook researchers wrote that there were groups and pages “replete with inflammatory and misleading anti-Muslim content” on Facebook.

The internal documents also detail how a plan "championed" by Facebook founder Mark Zuckerberg to focus on "meaningful social interactions” was leading to more misinformation in India, particularly during the pandemic.

The NYT report added that another Facebook report detailed efforts by Bajrang Dal to publish posts containing anti-Muslim narratives on the platform.

"Facebook is considering designating the group as a dangerous organisation because it is “inciting religious violence” on the platform, the document showed. But it has not yet done so,” the NYT report said.

The documents show that Facebook did not have enough resources in India and was not able to grapple with the problems it had introduced there, including anti-Muslim posts.

A Facebook spokesman, Andy Stone, said Facebook has reduced the amount of hate speech that people see globally by half this year.

"Hate speech against marginalised groups, including Muslims, is on the rise in India and globally,” Stone said in the NYT report. “So we are improving enforcement and are committed to updating our policies as hate speech evolves online."

In India, "there is definitely a question about resourcing" for Facebook, but the answer is not "just throwing more money at the problem," said Katie Harbath, who spent 10 years at Facebook as a director of public policy, and worked directly on securing India’s national elections.

The NYT report said Facebook employees "have run various tests and conducted field studies in India for several years. That work increased ahead of India’s 2019 national elections”.

In late January 2019, a few Facebook employees travelled to India to meet with colleagues and speak to dozens of local Facebook users, it said.

"According to a memo written after the trip, one of the key requests from users in India was that Facebook 'take action on types of misinfo that are connected to real-world harm, specifically politics and religious group tension',” the report said.

The report added that after India’s national elections had begun, “Facebook put in place a series of steps to stem the flow of misinformation and hate speech in the country, according to an internal document called 'Indian Election Case Study'.

"The case study painted an optimistic picture of Facebook’s efforts, including adding more fact-checking partners — the third-party network of outlets with which Facebook works to outsource fact-checking — and increasing the amount of misinformation it removed.

"The study did not note the immense problem the company faced with bots in India, nor issues like voter suppression. During the election, Facebook saw a spike in bots — or fake accounts — linked to various political groups, as well as efforts to spread misinformation that could have affected people’s understanding of the voting process."

Citing the Facebook report, the NYT said that of India’s 22 officially recognised languages, Facebook has trained its AI systems on five. But in Hindi and Bengali, it still did not have enough data to adequately police the content, and much of the content targeting Muslims "is never flagged or actioned."

Whistleblower Haugen to Testify in UK

AP reports from London: Former Facebook data scientist turned whistleblower Frances Haugen plans to answer questions on Monday from lawmakers in the United Kingdom who are working on legislation to rein in the power of social media companies.

Haugen is set to appear before a parliamentary committee scrutinising the British government's draft legislation to crack down on harmful online content, and her comments could help lawmakers beef up the new rules.

She's testifying the same day Facebook is expected to release its latest earnings.

It will be her second appearance before lawmakers after she testified in the US Senate earlier this month about the danger she says the company poses, from harming children to inciting political violence and fuelling misinformation.

Haugen cited internal research documents she secretly copied before leaving her job in Facebook's civic integrity unit.

She told US lawmakers that she thinks a federal regulator is needed to oversee digital giants like Facebook, something that officials in Britain and the European Union are already working on.

The UK government's online safety bill calls for setting up a regulator that would hold companies to account when it comes to removing harmful or illegal content from their platforms, such as terrorist material or child sex abuse images.

“This is quite a big moment,” Damian Collins, the lawmaker who chairs the committee, said ahead of the hearing.

“This is a moment, sort of like Cambridge Analytica, but possibly bigger in that I think it provides a real window into the soul of these companies.”

Collins was referring to the 2018 debacle involving data-mining firm Cambridge Analytica, which gathered details on as many as 87 million Facebook users without their permission.

Haugen also is scheduled to meet next month with European Union officials in Brussels, where the bloc's executive commission is updating its digital rulebook to better protect internet users by holding online companies more responsible for illegal or dangerous content.

Under the UK rules, expected to take effect next year, Silicon Valley giants face an ultimate penalty of up to 10% of their global revenue for any violations.

The EU is proposing a similar penalty.

The UK committee will be hoping to hear more from Haugen about the data that tech companies have gathered.

Collins said the internal files that Haugen has turned over to US authorities are important because they show the kind of information that Facebook holds — and what regulators should be asking when they investigate these companies.

The committee has already heard from another Facebook whistleblower, Sophie Zhang, who raised the alarm after finding evidence of online political manipulation in countries such as Honduras and Azerbaijan before she was fired.
