Fake sex videos: It can happen to anyone


Exclusive

Status: 11/29/2022, 5:00 p.m.

Millions of women are victims of fake sex videos and pictures online. In the past, it was mainly celebrities who were affected by deepfakes. An SWR investigation shows why more and more private individuals are becoming targets.

"Why me? What have I done to you that you take my pictures and go, 'Yeah man! Let her be humiliated!'?", the angry young woman asks in a video call with the reporter. The reporter had discovered private pictures of her in a porn forum, contacted her and alerted her to them. A former classmate had presumably uploaded the photos shortly beforehand; some of them show her as an underage girl. "Fake her!" — the request posted beneath the pictures is unmistakable: other users are asked to use the young woman's private pictures as templates for pornographic deepfakes. Entire picture galleries were uploaded to the porn forum, not just of her, but also of several of her former classmates.

According to experts, the number of deepfakes circulating on the Internet has exploded in recent years. Deepfakes are photos or videos manipulated using artificial intelligence (AI): images are fused with recordings of a specific person in such a way that the result looks authentic. According to studies, 96 percent of them show pornographic content, and the victims of these manipulations are almost always female. As early as 2019, a study on deepfake technology by the company Deeptrace concluded that it can and will be used primarily as a weapon against women.

So far, it has mainly been the faces of famous actresses, presenters, singers, politicians and other public figures that have been edited into pornographic films. However, research by the SWR investigative format VOLLBILD shows that private individuals are increasingly becoming easy victims of deepfake pornography and thus of humiliation, defamation and blackmail.

The technology for creating deepfakes is now easily accessible to laypeople as well, for example via open-source tools from the Internet or mobile apps from the Google Play Store or Apple's App Store. According to AI expert Henry Ajder, who also led the 2019 investigation, abuse has increased sharply. In the course of its research, VOLLBILD tested how easily deepfakes can now be produced by anyone: with the premium version of a deepfake app, it was possible to insert someone else's face into a porn video within a few minutes, without any prior knowledge.

Girls and women are increasingly affected

In 2018, researchers counted a good 14,000 deepfake porn videos circulating online; today it is no longer possible to put a number on it, says Ajder. "We're probably talking about billions of deepfakes that have been created in the meantime," the expert estimates. "We are talking about millions of women who are now being targeted with image abuse. While we cannot provide statistics, we can say with certainty that this technology has exploded, including in terms of its malicious use."

In an interview with VOLLBILD, the influencer and YouTuber "Mrs. Bella" talks about her experiences as a victim of deepfake pornography. When she first saw a photo of a naked woman bearing her face, she was shocked: "She looks like me, but that's not me, because I would never take a photo like that," she says. Confronted with a new deepfake porn video circulating about her, she was shocked at how deceptively real the manipulations look. She now wants to take legal action against the distribution of the video.

Perpetrators are often men from the circle of acquaintances

But who creates deepfake pornography of private individuals? Research shows that many men use deepfakes to act out their pornographic fantasies. For some, the point is to humiliate and degrade women and girls from their private environment. As counselor Anna Lehrmann from the association "Frauen helfen Frauen" ("Women Help Women") in Fürstenfeldbruck reports, deepfakes are also linked to other forms of violence such as blackmail or threats. Men often use them as a weapon in relationship disputes. "A lot of women also say: 'I don't want to do anything about it because I'm so afraid that it will make things even worse'," says Lehrmann. Many who come to her for advice would rather ignore the crime than take legal action, for fear of further provoking the perpetrator.

To find both perpetrators and victims of deepfakes, the VOLLBILD team spent weeks researching online forums and communities. With just a few clicks, publicly accessible albums full of private photos of hundreds of women can be found in these forums, some of them already manipulated to show the women supposedly naked or even having sex. In some cases, other users are asked to deepfake the uploaded images.

The investigative format VOLLBILD was able to locate some of the women affected. Reporter Filippa von Stackelberg contacted several of them; most had not previously known about the deepfakes, were shocked by the news and did not want to comment. The reporter also wrote to some of the alleged perpetrators. Some then deleted their accounts and galleries. One user admitted to having previously been in a relationship with the woman whose manipulated nude photos he had uploaded.

A group of young women from northern Germany who had gone to school together, and whose private pictures the reporter also found in one of the porn forums, were willing to talk. In the video call, they expressed their anger and were visibly shocked. The group decided to report the case to the police together.

Deepfakes are not a criminal offense in their own right

Josephine Ballon, a lawyer at the non-profit organization HateAid, which advises and supports those affected by digital violence, explains the extent of the damage such deepfakes inflict on victims. "Once the picture is in the public domain, it will be shared. Even if you manage to get it deleted, others may simply continue to share it on other platforms, or even save it to post again later. For those affected, that can mean having to search for their pictures for the rest of their lives and ensure that they are deleted."

When asked by VOLLBILD, the Federal Ministry of Justice stated that it was taking the issue "very seriously" and wanted to take action against deepfakes. A key-issues paper for a law against digital violence is to be presented soon. "The aim is to improve legal protection for those affected by digital violence," said a spokeswoman.

When asked, the State Criminal Police Offices stated that deepfakes do not yet constitute a criminal offense in their own right and are therefore not recorded as a separate category. Instead, deepfakes can fall under several existing offenses, such as violations of the Art Copyright Act or of personal rights; depictions of the sexual abuse of children and young people, or insult, may also come into question.
