Death threats allowed – media – SZ.de

Facebook and Instagram users can now wish for the death of Russian President Vladimir Putin on these platforms. This applies at least to accounts from Ukraine, Russia, Georgia, Poland and six other Eastern European countries. As the Reuters news agency reports, the Meta group (formerly Facebook), which owns both social media platforms, has sent instructions to its moderators in the course of Russia's war of aggression against Ukraine. They are to make an exception for hate speech directed at Russian soldiers and civilians when the context of the speech clearly relates to the Russian invasion. Content that would normally violate Facebook's and Instagram's own guidelines is allowed under strict conditions and is not to be deleted by the moderators. "We're doing this because we realized that in this specific context, 'Russian soldiers' is representative of the Russian military," says an internal email quoted by Reuters.

Death threats against Vladimir Putin or Belarusian President Alexander Lukashenko are now also temporarily allowed, with the restriction that they must be made only in general terms and contain no details about how the act might be carried out, Reuters quotes internal emails as saying. However, the exception does not apply to hate speech directed against prisoners of war or against Russians in general.

A Meta spokeswoman confirmed the exception to the SZ upon request: Posts such as "Death to the Russian invaders" are "temporarily permitted as forms of political expression" in light of the ongoing invasion. "These are temporary measures aimed at protecting the voices and expressions of people threatened by invasion."

Both Facebook and Twitter saw a spike in Russian disinformation surrounding the start of the Ukraine invasion. On Thursday, after the Russian bombing of a hospital in Mariupol, Twitter deleted tweets from Russian embassies claiming that the photo of an injured pregnant woman showed an "actress" with "very realistic make-up". And Meta announced in late February that it had blocked a smaller network of accounts and pages that had been spreading disinformation at the start of the war of aggression in Ukraine.

“Facebook is playing with fire”

Shortly after the Russian attack began, there was a change on Facebook regarding the Azov regiment, a far-right, ultra-nationalist unit fighting against Russia as part of the Ukrainian National Guard. Meta had long treated the group as an extremist organization that was not allowed to be represented or supported on its platforms. According to research by the US investigative platform The Intercept, restrictions on Azov were eased at the end of February, allowing users to applaud its role in Ukraine's defense. However, the group is still not allowed to recruit new fighters via Facebook.

Experts are critical of the exemption: "Facebook is playing with fire," says Felix Kartte of the Reset initiative, which advocates stricter regulation of tech companies in Germany and Europe. The exemption is "not only fodder for Putin's propaganda machine, it is also a security risk for people of Russian origin in the Baltic States or in Germany."

In fact, the Russian Embassy in Washington DC described the measure as "extremist" and asked Meta to suspend it immediately. Facebook and Instagram, it said, were "pitting nations against each other". The state news agency TASS quoted Duma speaker Vyacheslav Volodin as calling for Meta to be prosecuted for inciting violence against Russians.

The deletion practices of Facebook and Instagram have long been criticized. "Facebook does not manage to precisely moderate content that is easy to distinguish, such as racist hate speech. In this critical situation, Facebook should proceed according to transparent criteria and, above all, allocate sufficient resources to protect people from digital and physical violence," says disinformation expert Felix Kartte.

The persecution of the Rohingya is seen as a prime example of the problems Meta has with moderation

Facebook's role in armed conflicts in particular is controversial. In 2018, Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg had to admit that they had done too little to counter hate speech against the Rohingya in Myanmar. For years, the Muslim minority was threatened and denigrated by nationalist Buddhists on Facebook, and false reports circulated about alleged crimes by Rohingya, inciting violence that broke out again and again. The ongoing persecution of the Rohingya has claimed an estimated 25,000 lives and displaced more than a million people.

The alleged genocide in Myanmar is considered a prime example of the problems Meta has in moderating content in crisis areas. Former Facebook employee Frances Haugen, who came forward as a whistleblower last summer, has also sharply criticized the moderators' insufficient language coverage. In many languages other than English, dangerous content is rarely detected. One document shows that the US market accounts for 84 percent of the time budget for deleting and containing disinformation. Facebook disputes this figure in a statement.
