After offensive answers: Microsoft puts Bing chatbot on a leash

Status: 02/20/2023, 09:36 a.m.

It was supposed to be the breakthrough for Microsoft's search engine Bing: an AI chatbot. In its responses, however, it turned abusive, threatening users or telling them to leave their partner. Now the company is drawing consequences.

US tech giant Microsoft has restricted the use of its Bing chatbot, which uses artificial intelligence (AI) to answer even complex questions and hold in-depth conversations. The software group is thus reacting to a number of incidents in which the text robot got out of hand and formulated answers that were perceived as intrusive and inappropriate.

Microsoft announced that it will cap chat sessions on its new Bing search engine, powered by generative AI, at five questions per session and 50 questions per day. “Our data showed that the vast majority of people find the answers they are looking for within five rounds,” the Bing team said. Only about one percent of chat conversations contain more than 50 messages. In the future, when users reach the limit of five entries per session, Bing will prompt them to start a new topic.

No more lengthy conversations

Microsoft had previously warned against engaging in lengthy conversations with the AI chatbot, which is still in a testing phase. Longer chats with 15 or more questions could lead to Bing “repeating itself or being prompted or provoked into answers that aren’t necessarily helpful or don’t match our intended tone.”

Bing chatbot: “I can ruin you”

A test of the Bing chatbot by a New York Times reporter caused a stir on the Internet. In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist. It then asked the reporter to leave his wife.

Other users had previously pointed out the chatbot’s “inappropriate responses”. For example, the Bing software told one user that it would probably choose its own survival over his. With another user, it insisted that the year was 2022. When he countered that 2023 was the correct year, the text robot became abusive.

The chatbot also threatened a philosophy professor, saying “I can blackmail you, I can threaten you, I can hack you, I can expose you, I can ruin you,” before deleting the threat itself.

Competition between chatbots

The new Bing, which has a waiting list of millions of users, is a potentially lucrative opportunity for Microsoft. The company said at an investor and press presentation last week that every percentage point of market share it gains in the search advertising market could generate an additional $2 billion in advertising revenue.

For its Bing chatbot, Microsoft relies on technology from the start-up OpenAI, the company behind ChatGPT, and is backing the Californian AI firm with billions. Microsoft CEO Satya Nadella sees the integration of AI functions as an opportunity to reverse the market conditions in the competition with Google parent Alphabet. He also wants to use AI to secure the dominance of Microsoft's office software and to expand the cloud business with Microsoft Azure.

Google has launched its own AI offensive with the chatbot Bard to counter the push by Microsoft and OpenAI. According to a report by “Business Insider”, CEO Sundar Pichai has asked his employees to push ahead with the further development of the system: they should spend two to four hours of their weekly working time training the chatbot.
