Facebook Files: The findings from the internal documents

What whistleblower Frances Haugen has triggered for Facebook: serious allegations, and journalists in Europe and the USA sifting through thousands of pages of explosive material from the heart of the company. The documents offer a rare insight into the inner workings of the otherwise secretive company, and their publication was followed by several hearings in the US Congress. Add to that a worldwide outage of its services, looming regulation, worried investors and a falling share price. Even for Facebook, a company with plenty of experience in handling crises, it was a turbulent week.

Facebook is more than a successful tech company with an idiosyncratic sole ruler. With Facebook, Instagram and Whatsapp, Mark Zuckerberg controls three huge communication platforms; what he decides can influence the worldview of millions of people. 2.89 billion people have signed up for the services. Internal documents now show that in some cases he placed the well-being of Facebook above the well-being of its users. It is therefore worth taking a closer look and bringing order to the chaos of the past month and a half – first to the course of events, then to the content of the documents.

In mid-September, the Wall Street Journal published the first excerpts from the so-called Facebook Files: internal documents from the Facebook group, including studies, presentations and chat logs from employees. Behind the leak is Frances Haugen, who worked as a product manager at Facebook until this spring and passed the material on to the media. She initially confided in Jeff Horwitz, technology reporter at the Wall Street Journal, who began evaluating the material. The organization Whistleblower Aid helped her to protect herself legally and to find further helpers; lawyers and a communications agency were added. Shortly before the first articles appeared in the Wall Street Journal, political communication became a focus as well. Since then, she has been advised by the agency Bryson Gillette, founded by Bill Burton, former White House deputy press secretary under Obama.

Damning verdicts: Whistleblower Frances Haugen at her hearing before a US Senate subcommittee on October 5.

(Photo: Drew Angerer / AFP)

Haugen revealed her identity on the CBS news program “60 Minutes” and spoke to SZ, NDR and WDR. The whistleblower approached the research cooperation early on. She testified as a witness before the US Senate and filed eight complaints with the US Securities and Exchange Commission. A congressional staffer provided redacted versions of the documents to a consortium of international media organizations, which, alongside SZ, NDR and WDR, includes the French daily Le Monde, the Danish newspaper Berlingske, the Belgian news magazine Knack and the Swiss media group Tamedia. The non-profit organization OCCRP (Organized Crime and Corruption Reporting Project) helped to make the documents machine-readable.

The findings from the documents are far-reaching, and many stories about the leak will appear around the world in the coming weeks. As things stand, these are the seven central findings from the Facebook Files, along with the statements of Haugen and other whistleblowers.

1. Facebook has always been aware of its risks and side effects

Five years ago, an internal memo made its way to the public. “The ugly truth is that we believe so firmly in connecting people that anything that allows us to connect more people is good per se,” wrote Facebook executive Andrew Bosworth. Global networking might cost human lives, and terrorist attacks might be coordinated via Facebook. Still, he argued, the company was doing the right thing.

The memo was headed “The Ugly”, anticipating what the Facebook Files would reveal: despite all the talk of making the world a better place, Facebook knew about its downsides. Over the years, internal developers repeatedly pointed them out, but many warnings went unheard or were ignored. Again and again, Facebook vowed to improve after critical reporting. The documents cast doubt on how serious those apologies were – after all, Zuckerberg would only have had to listen to his own employees.

2. Zuckerberg lied to the public

Almost three billion people use Facebook’s various platforms, and supposedly they are all equal. At least that is what Zuckerberg maintained over and over again, even under oath before the US Senate. That was a false statement, because some users are more equal than others: a secret program called “XCheck” defined exceptions for around six million accounts. Their mostly prominent owners were allowed to break the house rules without being sanctioned.

This included the footballer Neymar, who, in a live stream, showed the real name and nude pictures of a woman who had accused him of rape. Normal accounts would have been deleted; Neymar’s video stayed online for 24 hours, and nothing happened to his account. Last year, posts that violated Facebook’s guidelines but whose deletion was delayed because of XCheck were viewed more than 16 billion times.

3. Instagram can harm teenagers’ mental health

A third of girls who feel uncomfortable about their bodies feel worse when they use Instagram. This is one of many figures from several internal studies that all point to one conclusion: the content young people see on Instagram endangers their self-esteem and can trigger depression or even suicidal thoughts.

Politicians have been confronting Facebook with these studies for weeks, and angry parents are worried about their children. The company has defended itself aggressively and feels it is being treated unfairly. In fact, some of the surveys are based on small samples, not every correlation is a causation, and apps like Tiktok and Youtube bring just as many problems with them. But there is one thing Facebook cannot argue away: there has been internal evidence for years that Instagram creates great pressure and insecurity among many teenagers. The public only found out about it through the leaks.

4. Facebook wants to win over children

For many parents it is a nightmare scenario, for Facebook it is an important part of the strategy: in the future, children should also use the company’s apps. Behind this lies a great fear of the competition. With hundreds of diagrams and tables, internal evaluations show that the young target group is drifting away. Most young people have long regarded Facebook as their parents’ platform, and Instagram, too, is becoming increasingly uncool; US teenagers prefer Snapchat and Tiktok.

Instagram itself described this development as an “existential threat” as early as 2018, but at the same time tried to play down the problem to investors. Countermeasures are now to be taken: so far, Messenger Kids is the only product aimed at children; officially, a minimum age of 13 applies to all other apps. In reality, however, Facebook can hardly control who signs up anyway. A children’s version of Instagram is therefore to be created. After the recent criticism, the work was paused, but probably not for long.

5. Facebook’s artificial intelligence is slow on the uptake

Tens of thousands of people review problematic content on behalf of Facebook. These so-called content moderators mostly work for third-party companies; their work is stressful and their pay is meager. Above all, they cannot keep up with deleting it all. That is why Facebook wants to replace people with machines. For years, Zuckerberg has praised the great strides Facebook’s systems have made in detecting hate speech, videos of atrocities or terrorist content.

Some of Facebook’s employees distrust this promise of salvation. “We will probably never have a model that detects the majority of integrity violations, especially in sensitive areas,” wrote a researcher in 2019. He speaks of a detection rate of two percent; another internal study puts it at three to five percent. Facebook counters that it is becoming increasingly unlikely for users to encounter hate speech on the platform, and that this metric is more relevant than the detection rate. One thing is clear: content moderation will probably never work perfectly – and Facebook will always pick the numbers that best fit its own narrative.

6. First comes the USA, then nothing for a long time

Although Facebook formed a team in 2020 to protect users from misinformation during the US presidential election, users outside the English-speaking world are far less protected. Take Arabic as an example: although more than 220 million users speak Arabic, making them the third-largest group on the platforms, there is an acute lack of moderators for the various countries. An internal study criticizes the fact that the moderators come mainly from Morocco and Syria and cannot properly assess content from Yemen or the Gulf region, for example. The internal report also makes it clear that Facebook classifies almost every country in which Arabic is the main language as a “high-risk country” facing diverse challenges such as “terrorism or human trafficking”. The development of artificial intelligence that is supposed to detect hatred and incitement before the content is shown to users is expensive and performs poorly in other languages; in Arabic, for example, the algorithms cannot distinguish between quotations from the Koran and actual calls for violence. Facebook itself says it employs 15,000 moderators covering more than 70 languages worldwide, including Arabic as spoken in Yemen, Libya, Saudi Arabia and Iraq.

7. For Facebook, profit is what counts first and foremost

Mainstream parties complained about a “civil war on social media” and perverse incentive systems, while extremist parties were pleased that their provocations were reaching a particularly large audience: this was the reaction from European politics when Facebook changed the logic by which content appears in the news feed in 2018. Before the change, positive and negative content had been shared in a balanced ratio, said the social media department of one Polish party; after Facebook’s intervention, attacks on political opponents dominated – because that is exactly what the algorithm rewards.

This example is just one of many in which Facebook has performed open-heart surgery on the world. The company tweaks the algorithm to get people to spend more time on Facebook – and only finds out months or years later that there were unwanted side effects. Some of the risks are apparent even beforehand: Facebook likes to boast about its research department, which is unique in the industry, yet part of this team is frustrated and angry. Facebook commissions studies but does not implement their recommendations; in the end, the marketing department almost always has the final say.

“I have seen again and again how Facebook deals with conflicts between profit and safety,” said Haugen during the hearing before the US Senate. “Facebook regularly resolves these conflicts in favor of its own profits.”

You can find all the texts about the “Facebook Files” here.
