A new sexist pornography: are deepfakes the new ill of the century?

Last May, Anaïs discovered photos of herself on the Internet. She is lying on her stomach, naked in front of a mirror, in a pose that highlights her breasts. At first glance, the photo could pass for a real, slightly erotic selfie. Except that Anaïs never chose to undress: in the original version, a towel completely covers the young woman’s body. “It’s my turn to discover the ‘joys’ of the sick, twisted people who retouch photos to masturbate to them on weird sites,” she denounced on X (formerly Twitter).

Anaïs is far from the only one to learn that naked photos of her exist on the Internet. Beyond simple photomontages, there are now what are called pornographic deepfakes: fake images generated artificially from real photos of a person. These hyper-realistic fabrications go well beyond the irony of Emmanuel Macron picking up trash or the Pope in a puffer jacket. Most of the time, deepfakes are malicious acts that expose their victims, going as far as depicting them in sexual scenes that were never filmed.

“Marketing genius and its horror”

Although this malicious trend has exploded in recent months, it is not new. “Pornography is generally the first testing ground for new technologies. This has been going on for a long time, since 2017 or 2018,” notes Laurence Allard, a digital specialist, lecturer in communication sciences and researcher at Université Sorbonne Nouvelle Paris 3-IRCAV. For her, the observation is not surprising: “Pornographic deepfakes appear very realistic in their rendering. But pornography is, by definition, a fetishistic, fantasy-driven genre. It is quite paradoxical to train models to produce perfect pornographic fakes when the genre is itself entirely fake.”

Paradoxical or not, the phenomenon has intensified in recent months, particularly since artificial intelligence tools such as the now famous Midjourney were opened to the public. “They chose to have their tools developed by the crowd. Everything involving learning, improvement and reinforcement is done by the public. From the moment these tools are co-developed in the hands of the crowd, it can only improve the models, renderings, directories and databases,” Laurence Allard explains, before adding: “That is the marketing genius of this thing, and its horror, because afterward anything and everything can be generated.”

Porn sites dedicated to deepfakes

The abuse is plainly visible online, even on pornographic sites, where fake images featuring celebrities have become categories in their own right. A pornographic site dedicated entirely to deepfakes has even been created. Here, the singer Ariana Grande performing oral sex; there, a gang bang with the actress Margot Robbie. Images that were obviously never shot, but generated artificially. American celebrities are not the only ones to bear the brunt. In her August vlogs, YouTuber Léna Situations warned about fake pornographic videos featuring her. More recently still, journalist Salomé Saqué shared pornographic montages depicting her topless. Describing the montages as “filthy”, she declared: “Not only are harassment and humiliation not decreasing on social networks, they are intensifying.”

A first misconception arises around pornographic deepfakes: by putting themselves in the public eye, don’t public figures expose themselves to this kind of abuse? It is possible that visibility to the general public increases the risk. But the phenomenon affects society as a whole. In Almendralejo, in southern Spain, 22 teenage girls attending different schools in the town were the victims of pornographic montages this fall. Half of them decided to file a complaint after the images circulated.

Women mainly victims

Whether the victims are public or private figures, there is a common denominator: they are overwhelmingly women. According to a study carried out by the company Deeptrace, more than nine out of ten deepfake videos are pornographic, and the people targeted “are women in 99% of cases”. One apparent way to protect yourself would be to post as few photos and videos as possible on social networks. But on the one hand, is it really up to the victims to protect themselves? On the other, some platforms, such as TikTok, leave wide latitude for reusing their users’ content. “TikTok can be scraped [i.e., its data extracted, Ed.]; you can get basic extractions of videos. As soon as there is a front door, scraping is possible, and with it the possibility of reusing images in ‘image to image’.”

Today, pornographic deepfakes still rarely lead to convictions. In France, article 226-8 of the Penal Code provides that “publishing, by any means whatsoever, a montage made with the words or image of a person without their consent, if it is not obvious that it is a montage or if this is not expressly stated,” is punishable by one year’s imprisonment and a fine of 15,000 euros. In an interview with Ouest-France, Jean-Noël Barrot, the Minister responsible for Digital Affairs, stressed that the bill to secure and regulate the digital space aims to toughen the penalties: they would be “increased to three years’ imprisonment and a fine of €75,000 when the montage was published via an online public communication service”.
