Protection against sexual violence: Is “chat control” coming?

Status: 20.06.2024 01:51

The EU is making a new attempt to force IT companies to monitor users more strictly. The aim is to better protect children from sexual violence. Critical voices are coming from Germany.

There is renewed movement on a law to protect children from sexual violence. It has been on the table as a proposal for more than two years.

EU Home Affairs Commissioner Ylva Johansson explains: “It is there to protect children from sexual violence. It is there to protect the victims of these terrible crimes from having to relive them over and over again on the internet.”

Increased tenfold in one year

Three out of five depictions of sexual violence known worldwide are located on a server in an EU country, according to data from the Internet Watch Foundation. In Germany, the volume has increased tenfold within a year.

The core purpose of the proposed EU law is to oblige Google, Meta and others to technically scan all content on their platforms for images of abuse. This is why the plan has also become known as “chat control”, especially in Germany.

Germany rejects the law

The Federal Republic continues to reject the EU law, as FDP Justice Minister Marco Buschmann made clear: “Germany is a country that has already experienced two dictatorships that showed no regard for privacy. That is why we are particularly sensitive in this area and will take great care to ensure that privacy, especially private communication, continues to be protected.”

The end justifies the means: that is how many other member states appear to see it. A new proposal from the Belgian Council Presidency could deliver the necessary majority among the 27 member states, even without Germany.

Audio messages excluded

The current version of the bill contains some changes: platforms would still be required to automatically scan all content, including chats with end-to-end encryption, but voice messages are now to be excluded. Only images and videos would be scanned.

In addition, users would first have to agree to the automated screening. If they do not do so, they would no longer be able to send pictures and videos on the respective platform.

Some providers are threatening to withdraw: most notably, the operators of the messaging services Signal and Threema have announced that they would leave the EU if required to scan encrypted messages.

“Law incompatible with fundamental rights”

For critics of the proposed law, it is fundamentally a step in the wrong direction. Most recently, 36 politicians, mainly from the FDP and the Greens, appealed to the EU member states in an open letter to vote against the law, arguing that it is incompatible with European fundamental rights.

In general, another argument goes, many of the planned measures are not targeted and lead to false suspicions. Instead, more resources and better coordination of law enforcement authorities in Europe are needed.

“Don’t get to the root of the problem”

That is all well and good, says Sebastian Fiedler, SPD interior politician and police officer, but it does not get to the root of the problem. “I haven’t heard a single sensible alternative suggestion as to how to get sexual violence against children, which has reached epidemic proportions, under control,” says Fiedler.

He believes that no cell phones should be in circulation that allow anyone to “handle such material, i.e. open it, edit it or send it”. This, he says, is technically possible.

This, in turn, goes well beyond the proposals from Brussels. And even if those proposals were to win approval in the Council today, the legislative process would still have some way to go.