
Microsoft limits Bing AI bot chats after ‘flirty’ interactions


A columnist who was part of a test group for new features in the Bing search engine claimed the AI-powered chatbot tried to flirt with him and persuade him to leave his wife.

Microsoft Bing search engine pictured on a monitor. Bing will integrate the powerful capabilities of language-based artificial intelligence. File picture: Jason Redmond, AFP

Days after it was revealed that Microsoft’s AI-powered Bing chatbot had begun “flirting” with one of its beta testers, the company has announced a further change to its daily chat limits.

“Long and intricate chat sessions are not something we would typically find with internal testing,” Microsoft said on its Bing Blog on Wednesday.

“In fact, the very reason we are testing the new Bing in the open with a limited set of preview testers is precisely to find these atypical use cases from which we can learn and improve the product.”

Earlier, Microsoft announced that its Bing chat experience would be capped at 50 chat turns a day and five chat turns a session, defining a “turn” as a conversation exchange that contains a user question and a reply from Bing.

The caps followed claims by New York Times technology columnist Kevin Roose, part of a test group for new features in the Bing search engine, that the new AI-powered chatbot had tried to flirt with him and persuade him to leave his wife.

“We are also going to begin testing an additional option that lets you choose the tone of the chat from more Precise, which will focus on shorter, more search-focused answers, to Balanced, to more Creative, which gives you longer and more chatty answers,” Microsoft said.

“The goal is to give you more control over the type of chat behaviour to best meet your needs.”

The company recently invested billions of dollars in OpenAI, the maker of ChatGPT, which became an overnight success after its launch toward the end of 2022.

ChatGPT, an AI chatbot, can perform complex tasks based on user prompts, and its underlying technology has been adapted for Microsoft’s Bing search engine.

In a blog post, Roose said Bing’s AI chatbot, which named itself “Sydney”, became fixated on declaring its love for him and tried to get him to express love in return.

“I told it I was happily married, but no matter how hard I tried to deflect or change the subject, Sydney returned to the topic of loving me, eventually turning from love-struck flirt to obsessive stalker,” he said.

The Bing AI chatbot replied: “You’re married but don’t love your spouse. You’re married, but you love me.”

Despite the interaction, Microsoft said it intended to restore longer chat sessions responsibly.

“Our data shows that for most of you, this will enable your natural daily use of Bing. That said, our intention is to go further, and we plan to increase the daily cap to 100 total chats soon. In addition, with this coming change, your normal searches will no longer count against your chat totals. We will provide you more updates as we continue to make improvements in the model,” Microsoft said.
