Chat control in the EU: Hands off WhatsApp, Signal and Threema!

“If you have nothing to hide, you have nothing to fear.” With this argument, supporters of surveillance measures like to justify stricter laws. The cases of Mark and Cassio show how dangerous that argument is. Both fathers described their experiences to the New York Times. They were worried about their sick children, consulted a doctor, sent photos of the children’s genital area at the doctor’s request – and faced the horrific suspicion of having illegally distributed nude photos of their own children.

The police dropped the investigations in both cases, but the allegations have long-term consequences for the wrongly suspected men. They can no longer access their Google accounts and have lost all the emails, documents, contacts and photos of more than a decade. That is also how Mark and Cassio came to the investigators’ attention in the first place: their Android phones saved the pictures to Google Photos, where software automatically classified them as alleged depictions of child abuse.

Surveillance system with downsides

Mark and Cassio are two of more than 270,000 Google users whose accounts were suspended last year for allegedly producing or distributing child abuse material. The software sounds an alarm, a Google employee reviews the photos or videos and notifies a child protection organization, which in turn alerts law enforcement agencies. Child abuse is one of the most horrific crimes human beings are capable of committing. It is good that perpetrators are identified and children are protected. But when suspicion falls on innocent people, the dark side of this surveillance system becomes apparent.

What the two fathers experienced allows three conclusions. The first concerns everyone who keeps most of their digital life with a single provider like Google: through no fault of their own, everything can be gone in one fell swoop. To this day, Google has refused to restore the two men’s accounts. It is best to spread your emails, documents and photos across different services, or at least to download backup copies of the data regularly.

The second concerns all parents and grandparents: be careful when photographing naked or half-clothed children. When two-year-olds splash around happily in the pool, the souvenir video may be harmless to the parents – to software, it may look suspicious. It is just as risky to document the course of an inflammation in the groin area with photos on the advice of a pediatrician. You should never save such images in cloud services; send them only via a securely encrypted messenger and delete them afterwards.

No more privacy in messengers

Even these precautions could soon be insufficient. Because – and this is the third lesson from the experiences of Mark and Cassio – the EU Commission is pushing for a form of suspicionless mass surveillance that deeply invades the privacy of hundreds of millions of people. If the plans go ahead, messenger services would have to screen all message content for depictions of child abuse, using software similar to what Google, Facebook and other companies already use to scan unencrypted photos.

This would also affect providers such as Signal or Threema, which deliberately encrypt chats in such a way that they themselves cannot read them under any circumstances. WhatsApp records who communicates with whom, when and from where, and shares this metadata with its parent company Meta, but it too cannot access the content of the messages. The EU Commission fears that pedophiles will use encryption to approach children undetected or to exchange illegal material.

Chat control goes too far

It is not only civil rights activists and privacy advocates who oppose the so-called chat control. The German federal government, including Interior Minister Nancy Faeser and Digital Minister Volker Wissing, has also criticized the project as disproportionate. The two cases from the USA show that these concerns are justified. Even the European Commission admits that the software is extremely error-prone. That means masses of sensitive messages and intimate photos of innocent people would end up with an EU authority. No one could be sure anymore that private communications remain private.

Most perpetrators use other channels to trade illegal material anyway. Law enforcement fails not for lack of data but for lack of resources: trained investigators and authorities with technical expertise. The fight against child abuse justifies many means, but not all. The planned chat control goes a decisive step too far. Even those who have nothing to hide would suddenly have something to fear.
