Fake Boyfriends and Girlfriends: Artificial Intelligence Plays with Our Feelings

Nadia goes overboard with devotion: “I love you, in a dance of light and shadow, a beacon of unwavering love, a whisper of security in the symphony of life, caring endlessly, caring forever.” These words are not only an aesthetic crime, but also the advertising slogan of an artificial intelligence (AI) offered as a “Girlfriend” in the so-called GPT Store, the new platform from Open AI. The company behind the well-known chatbot Chat-GPT lets practically anyone build small AI programs based on Chat-GPT technology in the GPT Store. The bots are trained with whatever material their creators choose, such as transcripts of romantic chats. Other people can then use these chatbots.

And then there’s trouble.

The point is not that Nadia’s programming may reveal a little too much about how her creator imagines the perfect woman. According to Open AI’s rules, there should be no “AI girlfriends” like Nadia, nor any “AI boyfriends”, in the GPT Store. The rules state: “We do not allow GPTs that have the goal of pursuing romantic relationships.” There is a growing fear of manipulative AIs tricking lonely and unstable people. Nevertheless, bots like Nadia or “Bossy Girlfriend” (“Talk to your bossy girlfriend”) are easy to find.

Even the masters of the AI age obviously cannot enforce the rules they have set for themselves on their own platforms. Other digital platforms are familiar with the problem of being unable to control all of their users’ content, from social networks like Instagram to the app stores of Apple and Google.

Philosopher Daniel Dennett warns of a world full of fake people

The GPT Store aims to create a global AI ecosystem directed by Open AI. The developers of the most popular bots are to receive a share of the revenue, while Open AI itself earns money from subscriptions to the professional version of Chat-GPT. The company also has to justify a valuation of $86 billion, and Microsoft would like to see a return on the $13 billion with which it is backing Open AI.

The GPT Store is an open platform on which somewhat autonomous assistance systems or interactive gadgets can be built and distributed. In the eyes of AI skeptics, it is a kind of laboratory in which life forms are bred that will soon populate the Internet and confuse many people. The philosopher Daniel Dennett warns of a world in which not only counterfeit money is in circulation, but also plenty of “fake people”. Even an AI optimist like Andrew Ng, entrepreneur and computer science professor at Stanford, who wants to teach people how to use AI properly, tells the SZ: “I’m pretty worried. The fake girlfriend/boyfriend industry is picking up speed faster than most people think. Companies build bots to form emotional relationships with people. Then they unethically monetize the feelings they have built up.” He advocates strict regulation of such emotional AIs.

The effects of artificial love relationships have not yet been properly researched. Positive effects are also conceivable: a constantly available chatbot trained under scientific supervision could, for example, help combat loneliness. Such programs could also support people with psychological problems in the dead of night and prevent the worst from happening.

What is certain – spoiler alert – is the punch line of the feature film “Her”, in which Joaquin Phoenix’s character falls for an AI voiced by Scarlett Johansson: if a bot is offered on the Internet, it is probably not talking to just one person, but at the same time to thousands of other lonely people who thought they could get their love from Open AI.