After spending 15 years working for some of the largest technology companies, including Google and Facebook, a whistleblower triggered a firestorm for Mark Zuckerberg’s company in recent weeks by releasing tens of thousands of pages of internal documents to The Wall Street Journal that show the social media giant is well aware of its negative effects on users but has not addressed them. Among other findings, the documents show that Facebook’s platform Instagram harms young users, especially teenage girls, and that the company’s platforms spread misinformation.
The identity of the whistleblower was revealed on the CBS News programme 60 Minutes on Sunday night as Frances Haugen, a 37-year-old data scientist with an MBA from Harvard who was a product manager working on civic integrity issues at the company.
Before quitting Facebook of her own accord in May, Haugen, who will testify before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security on Tuesday, copied thousands of pages of internal research and communications that she shared with the Securities and Exchange Commission (SEC) in her complaint against Facebook. Her attorneys have filed at least eight whistleblower complaints with the SEC, according to CBS News. 60 Minutes obtained the SEC letters from a Congressional source.
Haugen’s lawyers state in the filings: “Our anonymous client is disclosing original evidence showing that Facebook, Inc. has, for years past and ongoing, violated US securities laws by making material misrepresentations and omissions in statements to investors and prospective investors, including, inter alia, through filings with the SEC, testimony to Congress, online statements and media stories.”
The documents, according to Haugen, show that Facebook knows its platforms are used to spread hate, violence and misinformation, and that it has tried to hide that evidence.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimise for its own interests, like making more money,” Haugen told 60 Minutes correspondent Scott Pelley in the interview.
Haugen described “systemic” problems with Facebook’s ranking algorithm that led to the amplification of angry content and divisiveness. Evidence of that, she said, is in the company’s own internal research.
“Facebook’s mission is to connect people all around the world. When you have a system that you know can be hacked with anger, it’s easier to provoke people into anger. And publishers are saying, ‘Oh, if I do more angry, polarising, divisive content, I get more money.’ Facebook has set up a system of incentives that is pulling people apart,” Haugen told Pelley.
Haugen said that Facebook changed its algorithm in 2018 to promote “what it calls meaningful social interactions” through “engagement-based rankings”. She explained that content that draws engagement, such as reactions, comments and shares, gets wider distribution. Facebook’s own research found that “angry content” is more likely to receive engagement, something that content producers and political parties are aware of, she added.
“One of the most shocking pieces of information that I brought out of Facebook that I think is essential to this disclosure is political parties have been quoted, in Facebook’s own research, saying, we know you changed how you pick out the content that goes in the home feed,” said Haugen. “And now if we don’t publish angry, hateful, polarising, divisive content, crickets, we don’t get anything. And we don’t like this.”
Haugen added that Facebook recognises that “if they change the algorithm to be safer, people will spend less time on the site, they’ll click on less ads and they’ll make less money”.
Pelley quoted one internal Facebook document as saying: “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.”
Haugen said that she had “seen a bunch of social networks, and it was substantially worse at Facebook than anything I’ve seen before”. Eventually, she realised that she had to expose the whole affair in a systematic way. “I’m going to have to get out enough [documents] that no one can question that this is real.”
Haugen’s feelings about the company started to change when Facebook decided to dissolve its civic integrity team after the 2020 US presidential election. According to her, this decision, along with the move to turn off misinformation prevention tools, helped then-US President Donald Trump’s supporters use Facebook to organise the January 6 Capitol Hill riot.
“They basically said, ‘Oh good, we made it through the election. There weren’t riots. We can get rid of civic integrity now.’ Fast forward a couple of months, and we had the Insurrection. When they got rid of civic integrity, it was the moment where I was like, ‘I don’t trust that they’re willing to actually invest what needs to be invested to keep Facebook from being dangerous,’” she said.
Facebook has slammed the claims as “misleading”. “Every day, our teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place,” Facebook spokesperson Lena Pietsch told CNN Business immediately after the 60 Minutes interview. “We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.”
Pietsch released a 700-word statement laying out what Facebook called “missing facts” from the interview, which “used select company materials to tell a misleading story about the research we do to improve our products”.