Society: Digital humiliation – what to do about deepfake porn?


A man looks at a porn website on a smartphone (posed scene).

© Marcus Brandt/dpa

A few clicks, big impact: Apps allow any smartphone owner to edit faces from harmless photos into pornographic clips. One aspect in particular bothers those affected and organizations.

A portrait photo and a few clicks are now enough for anyone to become the star of an action film or see themselves with a new hair color using the right app. It is also possible to mount any face onto the bodies in sex clips or nude pictures. Deepfakes is the name for these techniques, which can be used to manipulate digital media such as images, videos and audio files.

“The fact that this is accessible to such a broad public opens the door to abuse,” said Josephine Ballon from the organization HateAid, which advocates for those affected by digital violence. Together with the Federal Association of Women’s Advice Centers and Women’s Emergency Hotlines and the organization Anna Naked, HateAid recently handed over a petition with more than 76,000 signatures to the Digital Ministry, calling for effective protection against deepfake porn.

HateAid calls for a ban on offering technology that creates fake nude photos and pornographic videos. In addition, app stores should be obliged to block such manipulation services. Porn platforms sometimes advertise these services directly on their sites, Ballon reported. A 2019 study by the research company Sensity AI found that pornographic content accounted for around 96 percent of fake videos online.

Many celebrities affected

“This is a serious situation for those affected,” said the HateAid managing director. According to her, women in public life appear in many deepfakes, such as influencers, actresses and politicians. “We see that the low-threshold accessibility of these offers also means that private individuals are increasingly affected,” she added.

The presenter and actress Collien Ulmen-Fernandes is one of those affected. “Just the fact that someone thinks they saw me naked, even though it wasn’t my body, feels like violence to me,” said the 42-year-old. In addition, it can damage one’s reputation if supposed nude photos reach one’s professional environment. “The problem is that you can’t do anything about it yourself, because these images are created from harmless pictures,” she explained.

Harsher penalties called for

“I would never have thought that something like this existed of me,” commented the YouTuber Mrs. Bella on a deepfake porn video that she discovered at the end of last year and has since had removed from the internet. “I would like this to be punished much, much more harshly.”

HateAid also calls for improved criminal protection in the petition. “The creation of such a deepfake of a woman is currently not a criminal offense per se,” said lawyer Ballon. Only the distribution of the material can be punishable. Even then, prosecutors must rely on offenses that were not designed for this purpose and are treated as petty crimes. The German Association of Women Lawyers likewise stated in its policy paper “Combating image-based sexual violence” from June of this year that the topic is regulated incompletely and unsystematically in German criminal law.

Also an issue at EU level

The Ministry of Justice counters this: “Current criminal law can generally adequately cover the abusive use of such deepfakes,” said a spokeswoman. A directive to combat violence against women and domestic violence is currently being negotiated at EU level, which also deals with the criminal liability of deepfakes. After the negotiations, it will be examined to what extent further implementation in German law is still needed.

Regarding the requested ban on deepfake technology, a spokesman for the Digital Ministry said that this was not planned. However, the ministry supports HateAid’s appeal to store operators to ban apps that actively promote the abusive creation of nude photos. Since August, the EU’s Digital Services Act has generally required very large platforms – including Facebook, Google and Amazon – to take action against illegal content, disinformation and deepfakes.

Cybercriminologist Thomas-Gabriel Rüdiger from the Brandenburg Police University observes that deepfake technologies are becoming increasingly relevant in a criminological context. He has another idea for preventing misuse of the apps: “A first protective measure could be to allow only pictures of the person who is using the program at that moment.” In addition, he sees digital education as a crucial factor. “Above all, we have to make it clear to minors that, in principle, every video or image in the digital space can be fake. We have to educate the population in general about such developments. There is still a lot to do.”

dpa
