Child abuse: compulsory scanning for Facebook, Whatsapp & Co. is wrong

The fight against the distribution of recordings of abused children must be intensified, as shown not least by the horrific cases in Bergisch Gladbach and Lügde. Investigators are overwhelmed by the sheer volume of pictures and videos. But because hardly any crime is as stigmatized as child abuse, anyone who criticizes individual measures against it makes themselves unpopular. It is time for the EU Parliament and civil society to make themselves unpopular anyway and to resist the EU Commission’s new plans. For the Commission is misusing the goal of catching perpetrators to cast a new surveillance net over all computers and cell phones.

The new rules are intended to oblige tech companies to screen their users’ private communications on platforms and in chat apps such as Facebook, Whatsapp or Signal. They are supposed to detect depictions of abuse and report them to a new EU authority; if they fail to do so, they face heavy penalties. It is bad enough that tech companies have built a surveillance system that collects people’s data trails in order to serve them tailored ads. Now politicians want to bolt a system of state snooping onto this commercial one.

Matthew Green, one of the leading researchers on encryption, describes the plans as “the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR.” Even if that is polemical, the draft certainly goes beyond some of the 2013 Snowden revelations about NSA snooping on tech companies. Back then, companies had to hand over messages from specific email addresses to the intelligence service. Now, across the EU, every message on every device is to be scanned.

Automated, adaptive systems are evidently to be used for this. Such technology is in principle suited to finding depictions of abuse, but the consequences are unforeseeable. Compulsory scanning would founder on end-to-end encryption, which secures Whatsapp chats, for example. The rules would therefore force companies to corrupt this best form of encryption to date. That would not only weaken it on the cell phones of perpetrators; it would render it useless for all users. Absurdly, the EU simultaneously obliges companies to protect their users’ privacy with rules such as the General Data Protection Regulation. In any case, messengers are not the perpetrators’ tool of choice: the scene uploads files to websites or operates directly on the dark web, which is difficult to access.

In addition, the technology of recognizing images by their digital fingerprint is error-prone. Because it lacks a sense of the context and detail of images, it will report hundreds of thousands of false hits. By the time this is noticed, private – and completely legal – pictures will already be in the hands of the EU authority. And who guarantees that the software can tell whether a person in a picture is still a minor or has just turned 18, and whether a crime has been committed at all? The new system would also give the agency a dedicated line into the intimate lives of young people.

The planned use of automated systems becomes even more absurd when it comes to grooming, the way perpetrators approach future victims in chats. Companies are supposed to detect this automatically as well. A technology that recognizes such complex patterns in language without ensnaring innocent people simply does not exist. The EU is envisioning science fiction that would soon arrive in a reality full of false suspicions.

Automation with software will certainly change society, but if politicians try to automate away society’s problems, they will fail. Apparently the Commission has let itself be convinced that software can substitute for investment in well-trained investigators supported by psychologists. Money for such units, and for technology that targets individual suspects, would be more effective than turning controversial companies into deputies.