Whistleblower Frances Haugen: “Facebook has blood on its hands”

Status: 03/15/2022 18:03

Last year, Frances Haugen’s revelations put a lot of pressure on Facebook’s parent company, Meta. Now the whistleblower is following up – but also talks about possible solutions.

By Katharina Wilhelm, ARD Studio Los Angeles, currently in Austin

Does Facebook have blood on its hands? That’s what Frances Haugen was asked in a conversation at the SXSW media fair. She takes a deep breath: “Facebook has known for a long time that its decisions to maximize profits hurt people,” says Haugen. “Children hurting themselves, contributing to suicides, fueling racial conflicts around the world. So yes, Facebook has blood on its hands.”

Inappropriate handling of hate speech

Haugen suddenly became known as a whistleblower last year, when she passed internal documents from her time as a Facebook employee on to journalists. These showed, among other things, that Facebook – now part of the Meta group – had kept under wraps studies dealing with young people’s mental health and their use of social media.

Haugen had worked for a number of years at Facebook on a team tasked with dealing with misinformation and fake news outside of the United States. Facebook is accused of fueling conflicts in different regions by not responding adequately to hate speech. Haugen cites Ethiopia and Myanmar as examples. In December, the Rohingya Muslim minority group sued Facebook, accusing the company of complicity in the 2017 genocide in Myanmar.

Algorithms prefer English – and violent videos

Haugen says the problem is Facebook’s omnipresence and power in non-Western nations. “Facebook went to these countries and said: If you want to use the open Internet, you have to pay. But if you want to use our closed Internet, it’s free.” For many people who speak other languages, Facebook makes up as much as 90 percent of the web content they see. “When we talk about misinformation in the US and Europe, we have to realize that we still have a choice – millions of others don’t have that choice,” Haugen said.

Facebook focuses its machine learning too heavily on widely spoken languages like English or Spanish – and not on dialects and minority languages. That makes it much harder to identify and delete hate speech. In addition, the algorithm is still designed to favor content that promises a lot of interaction – which, according to Haugen, tends to be content that triggers negative feelings. The algorithm also favors live videos, even though terrible things, such as suicides or shooting sprees, are regularly broadcast live.

Read first, then share

Haugen did not bring only criticism to the fair. During her hour-long talk, she also discussed ways to make Facebook a better network – for example, by encouraging users to read articles before sharing them. Twitter, she notes, prompts users to click on a link before sharing it. This small step reduces the spread of false information by 10 to 15 percent.

Another suggestion from Haugen: have more content screened by people instead of by artificial intelligence – and at the same time take better care of the people who filter problematic content, since they often suffer from post-traumatic stress disorder. Facebook, she said, would also have to accept a small cut in profits.

Frances Haugen’s insights were not entirely new, but they are more relevant than ever given the war in Ukraine and the flood of misinformation and fake news. After her presentation, the whistleblower received a standing ovation.

Katharina Wilhelm, ARD Los Angeles, currently Austin, March 15, 2022, 1:31 p.m.