AI Pornography on the Rise: Two Students Invented “Claudia”

Artificial intelligence
“You are very pretty” – two students invented a woman and deceived others with suggestive images

Images like this are slowly flooding the internet. At first glance you see an attractive woman – but the person shown does not exist. The image was created artificially. (Symbolic image)

© Addictive Stock / Pau Novell / Imago Images

Artificial intelligence is omnipresent, and by now realistic images can be created with ease. Two students used this technology to make money.

Three months ago, “Claudia” first appeared on the discussion platform Reddit. The person behind the account painted the picture of an uninhibited young woman who shares intimate stories from her bedroom. The first photo followed about a month later. It showed a woman on a forest path, dressed in a miniskirt, pantyhose and a low-cut crop top. Her face is not visible, but the picture attracted attention.

Another month later came a picture of the woman from behind: a thong, a bra and lots of skin, and again no face. The face followed two weeks later, when “Claudia” posted in a subreddit for portraits and once more collected compliments. But with that she went a step too far.

For the first time, a few users suspected that the images might be fake. It did not help that the person running the “Claudia” account published a nude picture a little later – even though, in the comments under the explicit image, once again no one objected that what was shown could not be real.

Two students invented the AI persona “Claudia”

Until the end, it remained a mystery who was behind “Claudia” – and whether it might not be a real, very attractive woman presenting herself so openly to strangers. In a conversation with “Rolling Stone”, the two men who invented “Claudia” revealed themselves: “Claudia” is a creation of the AI tool Stable Diffusion, with which two students tried to earn money. And that worked.

The idea for the project came about when the two heard of a case in which a Reddit user reported making money by distributing stolen pictures of real women. They wanted to test whether the same would work with artificially generated images, whose copyright, unlike that of stolen photos, is at least unclear and may even belong to the creators. The legal question has not really been settled so far.

Before “Claudia” was exposed, the two students say they earned $100 by selling commissioned pictures of the fake persona. Claudia’s creators told “Rolling Stone”: “You could say that this whole account was just a test to see whether you can fool people with AI images. You could compare it to vtubers, who create their own characters and play an entirely different person. We honestly didn’t think it would resonate this much.”

Vtubers are virtual personas that attract large audiences on streaming platforms such as Twitch, and whose “owners” usually do not reveal themselves. There are talking animals, young women with a manga look and much more. Quite a few companies have taken a liking to such artificial influencers and sign lucrative advertising deals with their operators: “lilmiquela”, for example, has 2.8 million followers on Instagram even though she does not exist.

Many AI tools set tight limits – but not all

With the advent of very accessible tools such as Stable Diffusion, which place relatively few limits on image creators, a case like “Claudia’s” may be just the beginning. The two students expect as much and say that there will be more and more such accounts. Without regulation or labeling requirements, it is becoming increasingly difficult to separate reality from fiction.

When it comes to lewd content, many image generators block most attempts outright. Even Stable Diffusion stops attempts to create pornographic scenes on its official site. As long as you stay within the permitted framework, however, it is easy to produce entire picture books of artificial people. For everything beyond that, alternatives are emerging, for example “Unstable Diffusion”, as the trade publication “golem” writes.
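
How such a filter behaves can be seen in the open-source diffusers library, one common way of running Stable Diffusion. The minimal Python sketch below is an illustration under that assumption, not the students’ actual workflow; the model ID and prompt are placeholders. The pipeline loads a safety checker by default, which blacks out any image it classifies as explicit and reports the decision alongside the result.

    import torch
    from diffusers import StableDiffusionPipeline

    # Model ID and prompt are illustrative placeholders.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16,
    ).to("cuda")

    # The safety checker is enabled by default; passing safety_checker=None
    # would disable it, which is one reason "uncensored" forks and services
    # like the ones mentioned above have appeared.
    result = pipe("portrait photo of a young woman on a forest path")

    image = result.images[0]                 # blacked out if flagged
    print(result.nsfw_content_detected)      # e.g. [False], one flag per image
    image.save("generated_portrait.png")

Within those limits, generating unremarkable portraits like “Claudia’s” is straightforward; only prompts the checker flags are suppressed.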

