How AI is being abused for child pornography


Exclusive

As of: January 5, 2024, 6:08 a.m.

Artificial intelligence is increasingly being misused to create child pornography, as research by the SWR investigative format Vollbild shows. Some images are purely artificially generated; others are based on real images of children or even on real child abuse.

By Fabian Sigurd Severin, SWR

Thousands of artificially generated images of children and young people are being distributed under certain hashtags on the social media platform Instagram. Many of the children are depicted in skimpy underwear, in swimwear or in sexualized poses. This is shown by research by the SWR investigative format Vollbild.

Some accounts that share such images link to trading, crowdfunding or community platforms, where images of explicit child sexual abuse created using artificial intelligence (AI) are sometimes distributed. Experts warn of the danger of AI-generated abuse images: they make investigative work more difficult and could increase pedophiles' willingness to commit real assaults.

Trading in AI-generated images of child abuse

According to Vollbild research, the linked community platforms include a Japanese website on which explicit AI-generated images of child abuse are shared. The site is also known to the Federal Criminal Police Office (BKA) and, judging by user comments, appears to be used by people with pedophilic disorder to network.

Several users link from there to a website that contains real images of child abuse. This is confirmed by the Internet Watch Foundation, an international reporting center for child sexual abuse based in Great Britain.

AI-generated images based on real child abuse are also in circulation. This is indicated by user comments beneath AI-generated images as well as by observations of the Internet Watch Foundation.

Missing safety mechanisms

“People sell access to such images or are paid to create them,” says Dan Sexton of the British NGO. So far, the organization has found most AI-generated child abuse images on the dark web. According to Sexton, as their number grows, so does the risk that more and more of them will end up on the open web: “This is not something that might happen in the future, but something that is already happening now.”

According to Vollbild research, the artificial images of child and youth pornography are produced primarily with a version of the AI program Stable Diffusion. In contrast to the two major competing image-generation programs, DALL-E and Midjourney, Stable Diffusion is open-source software whose code is publicly accessible. The software version in question contains no safety mechanisms that would prevent, for example, the creation of nude images. This is shown by a self-experiment by the Vollbild editorial team.

Potential overload for authorities

The BKA does not record cases of AI-generated pornography separately, but it classifies the overall risk posed by child and youth pornography as high. In 2022, the number of cases rose by 7.4 percent for child pornography and by 32.1 percent for youth pornography. In addition, everyday real photos can serve as the basis for AI-generated pornography. According to the BKA, the artificial images can hardly be distinguished from real ones.

“What I find worrying is that the amount of available material will increase, but above all the quality of the content,” says Senior Public Prosecutor Markus Hartmann, who heads the Central and Contact Point Cybercrime (ZAC) in North Rhine-Westphalia. According to Hartmann, this development could lead investigative authorities to incorrectly assess AI-generated images as evidence of new, actual abuse and thus push them to the limits of their resources.

AI child pornography can trigger perpetrators

But the supply of artificially generated child pornography is also a danger for people with pedophilic disorder, says Professor Klaus Michael Beier, director of the Institute of Sexology and Sexual Medicine at the Charité in Berlin. He also heads the Berlin branch of “Don’t Become a Perpetrator,” a support service for pedophiles.

The problem: like real child pornography, AI-generated images also distort perception, says Beier. They lead pedophiles to believe that sexual contact between children and adults is possible and that children even desire it.

His warning is supported by an international study by the Finnish NGO “Protect Children,” in which more than 8,000 people who consume images of child abuse on the darknet took part. Around a third of respondents stated that they had sought real contact with children after viewing such images.

Possible legal loophole in Germany?

In Germany, the distribution, acquisition and possession of child or youth pornographic images is prohibited under Sections 184b and 184c of the Criminal Code. But because there is as yet hardly any case law on AI-generated child and youth pornography, uncertainty remains, according to Rhineland-Palatinate’s Justice Minister Herbert Mertin (FDP): “One problem could be that merely producing such material, without distributing it, goes unpunished if it is done with artificial intelligence. If you do it with real children, it is punishable.”

The Conference of Justice Ministers has therefore asked Federal Justice Minister Marco Buschmann (FDP) to set up a commission of experts to address these new developments, says Mertin.

Platform providers obliged by EU law

In response to a Vollbild inquiry, a spokeswoman for the Federal Ministry of Justice writes that the “criminal law instruments” are continually being reviewed. In addition to real material, “child and youth pornographic representations generated using AI” are, as a rule, also punishable. If illegal content is distributed via online platforms such as Instagram or the aforementioned Japanese website, the Digital Services Act applies. The EU law obliges platforms to set up reporting procedures and to take action against misuse of their services.

In response to a Vollbild inquiry, spokespersons for Instagram and the Japanese community platform take a clear position against child pornography. Instagram says it takes action not only against explicitly sexual content but also against profiles, pages or comments that share images of children without a clearly sexual connotation if the captions, hashtags or comments contain inappropriate expressions of affection.

Platforms respond inadequately

However, Vollbild research shows that Instagram and the Japanese website comply inadequately with their obligation to remove illegal content. In the course of the research, the editorial team reported several dozen Instagram accounts whose owners advertised real child and youth pornography for sale.

Instagram deleted only a third of the accounts within 48 hours. For the others, it initially found no violation of its community guidelines; the remaining accounts were deleted only after further reports.

A spokeswoman for the Japanese website writes that the AI-generated images of child and youth pornography reported by Vollbild have been deleted. Yet even after her reply, comparable AI-generated images of child abuse could still be found on the community platform.

“We have an appropriate legal framework in Germany that can be used to remove such material from the internet,” says Rhineland-Palatinate Justice Minister Mertin. “Our problem is always getting hold of those responsible.”

Many perpetrators are based abroad and are therefore difficult to apprehend, and international cooperation is sometimes difficult. Senior Public Prosecutor Hartmann sees the main problem in the fact that it is not easy for platform operators to recognize such images.

A television documentary on this topic will be available from Monday, January 8, at 6 a.m. in the ARD media library.

Thomas Becker, SWR, tagesschau, January 5, 2024, 4:00 p.m.
