What may it cost to protect children? An essay


This text revolves around one of the most terrifying crimes human beings can commit: the abuse of children. Disgusting hate comments can be found on social media, weapons are traded on the darknet, extremists trumpet their propaganda to the world through digital channels. But nowhere is the net darker than in the forums, messengers and cloud storage services where child molesters exchange photos and videos.

To call such recordings child pornography is to trivialize them. When toddlers are raped and minors are tortured, it has nothing to do with pornography. It is the most brutal form of abuse. In professional circles, the term Child Sexual Abuse Material, or CSAM for short, has become established. The scale of the crimes is as gigantic as they are bestial: child protection organizations receive tens of millions of reports each year, and these are only the suspected cases that come to light.

Many perpetrators use digital techniques to cover their tracks. They hide their identity with the Tor browser and communicate via encrypted messengers. The same end-to-end encryption also protects billions of people by preventing criminals and secret services from reading their messages. The same applies to the darknet: not only do pedophiles go into hiding there, so do opposition members and human rights activists who are persecuted in their home countries.

A compromise that reconciles child protectors and data protectors seems impossible. On the one hand, there are organizations that regard strict data protection as nothing more than offender protection. On the other hand, there are civil rights activists and IT security experts who regard any attempt to give investigators a view into secure communication channels as a dam break that paves the way for mass surveillance.

Apple wants to scan all photos on the local device

The past month has shown how emotional this debate is. At the beginning of August, Apple published a blog entry with the seemingly harmless title "Expanded Protections for Children". What followed the announcement was more than criticism. It sparked a storm of indignation that continues to this day.

To understand the emotional responses, one has to understand what exactly Apple is planning. The measures will initially apply only in the USA, but other countries are to follow. Apple wants to compare all photos that people save on iPhones and iPads against a CSAM database in which millions of recordings of child abuse are catalogued. Software uses each image to generate a so-called hash value, a kind of digital fingerprint that is compared with the known hashes from the database. For this purpose, Apple uses a technology that also recognizes edited or distorted photos and at the same time is supposed to produce hardly any false alarms.
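The principle can be illustrated in a few lines of code. What follows is a minimal sketch using the open-source Python library imagehash and a perceptual hash (pHash); Apple's actual system relies on its own NeuralHash and cryptographic matching protocols, so the database entry and the distance threshold below are purely illustrative assumptions.

```python
# Minimal sketch of fingerprint matching with a perceptual hash.
# Assumes the open-source "imagehash" library; Apple's NeuralHash
# and its matching protocol work differently in detail.
from PIL import Image
import imagehash

# Illustrative stand-in for a database of hashes of known abuse images.
known_hashes = {imagehash.hex_to_hash("d1c4f0e2a3b49687")}

def matches_database(photo_path: str, max_distance: int = 4) -> bool:
    """Return True if the photo's fingerprint is close to a known hash."""
    fingerprint = imagehash.phash(Image.open(photo_path))
    # Unlike a cryptographic hash, a perceptual hash changes only slightly
    # when an image is cropped, recompressed or recolored, so a small
    # Hamming distance still counts as a match.
    return any(fingerprint - known <= max_distance for known in known_hashes)
```

The point of the perceptual approach is this tolerance for small edits: an exact-match cryptographic hash would be defeated by re-saving the image a single time.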

Apple advertisement at the Consumer Electronics Show in Las Vegas.

(Photo: Glenn Chapman / AFP)

Almost all major tech companies have been scanning their platforms and messengers this way for many years. With its plan, however, Apple has crossed a line that the company once drew itself. "What happens on your iPhone stays on your iPhone," read a giant billboard in Las Vegas in 2019. In the future, Apple will have to put an asterisk behind this advertising promise: the scanning does not take place in iCloud, but locally on the device.

Privacy is a fundamental human right, Apple CEO Tim Cook has asserted again and again for years. And indeed, in the past his company has worked hard to protect its customers' data. In 2016, for example, Apple refused to crack a terrorist's encrypted iPhone for the FBI. "We believe the contents of your iPhone are none of our business," Cook wrote at the time in an open letter. He believed the FBI had good intentions, Cook argued, but the only way to prevent such a powerful tool from being misused was not to create it in the first place. "In the analog world, this corresponds to a master key that opens hundreds of millions of doors."

Five years later, the dispute is repeating itself with the roles reversed. Now Apple has developed the master key that Cook warned against at the time, and those who applauded him back then are holding his own words against him. Journalists' associations see freedom of the press in jeopardy, and more than 25,000 people, including many renowned IT security researchers, have signed a petition by the Electronic Frontier Foundation (EFF). Even Apple's own employees are said to have criticized the plan internally.

The technology could also be used for surveillance or censorship

Everyone is worried about the same thing: what if it does not stop at CSAM? So far, Apple only wants to search photos for depictions of child abuse, and anyone who turns off iCloud Photos also deactivates the local scanning. But law-and-order politicians could soon insist on searching for terrorist content, for which similar databases exist. It does not take much imagination to picture how other countries could abuse the system. India is already forcing platforms like Twitter and Facebook to censor free expression. A head of government like Narendra Modi may well already be contemplating laws that would expand the CSAM filter to include criticism of the government.

For Apple, the huge country has so far been a small market; most people in India use Android devices. But what happens if China threatens to stop iPhone sales unless Apple filters out unwelcome content? It would not be the first time Cook has thrown principles overboard when Apple faces the question: Chinese yuan or Chinese user data? Edward Snowden therefore sees "the dawn of a dark future that will be written in the blood of the political opposition in a hundred countries that will mercilessly exploit the system".

Afghanistan shows how dangerous it is when technology falls into the wrong hands. The Taliban are using social networks to track down people who cooperated with the West. The terrorist militia has allegedly captured US military equipment that can be used to identify faces and fingerprints. Technology must therefore not be designed only for democratic states and conscientious corporations. A safety net is needed that holds even when autocrats take over. Even if you trust Tim Cook's word that Apple will never hand over the keys, do you want to trust the person who will succeed him in five or ten years?

Apple is trying to save what can be saved. Shortly after the unfortunate blog entry, the company hastily added some explanatory documents and dispatched several high-level managers who, in unusually defensive interviews, admitted errors in communication and fleshed out the plans. The filter list is to be independently auditable so that Apple cannot censor unnoticed on behalf of authoritarian regimes. In addition, there will be only one global database, not nationally adapted versions. In retrospect, it was a mistake to present several functions at the same time, which led to "confusion".

This outraged two computer scientists at Princeton University. "We are not concerned because we misunderstand how Apple's system works," they write in the Washington Post. "The problem is that we understand exactly how it works." The two researchers had spent two years investigating how CSAM can be detected without undermining encryption. Eventually they broke off the project and began to warn against their own development: the risk of abuse was too great. In August they had planned to speak at a conference about how to minimize that risk. Then Apple got ahead of them and introduced a very similar system.

It is a dilemma for which there is currently no way out

It is understandable that Edward Snowden feels betrayed. He made great personal sacrifices to warn the world about NSA surveillance. Now he sees a corporation he previously trusted building in a vulnerability that his former colleagues at the NSA could only have dreamed of. The same goes for IT professionals and activists who have spent decades grappling with politicians who, in ever new editions of the "crypto wars", play security off against privacy, as if mass surveillance, data retention and facial recognition were the only ways to solve crimes.

It is just as understandable, however, that Apple no longer wanted to look the other way. "We are the largest platform for the distribution of child pornography," one Apple manager wrote to a colleague last year in a private message. In 2020, Facebook sent more than 20 million reports to the US child protection organization NCMEC; Apple submitted 265 cases. One can believe Apple that it is not only trying to protect its own reputation, but first and foremost children.

It is a dilemma for which there is no way out, and which will probably never be completely resolved. The fronts are hardened. "I have friends at both the EFF and NCMEC," wrote Stanford professor Alex Stamos, who previously ran Facebook's security department. He was disappointed by both NGOs; their statements left no room for dialogue. In fact, the tone on both sides ranges from sharp to shrill. The civil rights activists paint gloomy surveillance scenarios, while the child protectors dismiss legitimate concerns as the "screeching voices of the minority" and promise: "Our voices will be louder."

Whoever shouts loudest wins: this dualism is part of the problem. Apple's solution is risky, but doing nothing is not a solution either. Few people know both camps as well as Stamos. On substance he leans more toward the privacy advocates, but over the past month he has tried again and again to mediate. He can even take something positive from the dispute: "Now we have the opportunity for all parties involved to understand each other's concerns and try to come to a safe, sensible compromise."
