More and more American teenagers fall victim to fake nudes

In its wake, artificial intelligence carries many dangerous misuses, particularly fake nude photographs. Ellis, a 14-year-old American teenager, woke up one October morning to several missed calls and frantic messages on her phone: fake nude photographs of her were circulating on social networks.

“They were photos of one of my best friends and me from Instagram. Naked bodies had been edited onto them,” says the schoolgirl. “I remember being very, very scared because these fake nudes were being sent around me. And I’ve never done anything like that.” Like several classmates at her middle school on the outskirts of Dallas, Texas, she was the victim of hyperrealistic sexual montages (deepfakes), made without her consent by a fellow student and then shared on Snapchat.

“Mom, it looks like me”

The popularization of artificial intelligence (AI) has made it easier to create hyperrealistic photo and video montages, paving the way for their use in harassment and humiliation. “The girls were just crying and crying and crying, they were ashamed,” recalls Anna Berry McAdams, who was alerted immediately by her daughter Ellis.

“She said to me: ‘Mom, it looks like me,’ and I replied, ‘No, my darling, I was with you when that photo was taken and I know that’s not the case,’” continues the fifty-year-old, who says she is “horrified” by how realistic the photos are. At the end of October, other sexual deepfakes were discovered at a high school in New Jersey, in the northeastern United States. An investigation was opened to identify all the victims and the perpetrator(s).

“Anyone” can make them today

“I think this will happen more and more frequently (…) and that there are victims who don’t even know they are victims, that there are photos of them,” laments Dorota Mani, the mother of one of the identified students, also aged 14. “We are starting to see more and more cases emerge (…) but with sexual abuse, ‘revenge porn’ [the malicious disclosure of intimate images] or pornographic deepfakes, many people do not come forward and suffer in silence because they are afraid to make the matter public,” explains Renée Cummings, a criminologist and artificial intelligence researcher.

While the extent of the phenomenon is impossible to assess, “anyone with a smartphone and a few dollars can now create a deepfake,” says Cummings, a professor at the University of Virginia. This is the result of recent progress in, and the democratization of, generative AI, which can produce text, lines of code, images and sound from a simple request in everyday language.

Women and girls targeted

Hyperrealistic montages, which once targeted the image of celebrities “who have piles and piles of photos and videos of themselves online,” now concern everyone, explains Hany Farid, a professor at the University of California, Berkeley. “If you have a LinkedIn profile with a photo of your head, someone can create a sexual image of you,” continues the specialist in detecting digitally manipulated images, noting that these montages “mainly target women and young girls.”

Faced with this growing threat, schools and courts appear overwhelmed. In the United States, no federal law punishes the production and distribution of fake sexual images, and only a handful of states have specific legislation. At the end of October, Joe Biden urged legislators to establish safeguards, in particular to prevent “generative AI from producing child sexual abuse material or non-consensual intimate images of real people.”

A “very real trauma”

Even though these images are fake, “the trauma is very real,” adds Renée Cummings, describing people “suffering from anxiety, panic attacks, depression and even post-traumatic stress after being the victims of pornographic deepfakes.” “It can destroy a life.”

Ellis, the Texas teenager who describes herself as “social” and athletic, says she is now “constantly afraid,” even though the student behind the deepfakes has been identified and temporarily suspended from school. “I don’t know how many photos he made or how many people may have received them,” she explains, adding that she has asked to change schools. Faced with this uncertainty, her mother is campaigning to have these images recognized as “child pornography” and punished as such.
