Microsoft is looking at ways to rein in Bing AI chatbot after troubling responses

New York

Microsoft on Thursday said it is looking at ways to rein in its Bing AI chatbot after a number of users highlighted examples of concerning responses from it this week, including confrontational remarks and troubling fantasies.

In a blog post, Microsoft acknowledged that some extended chat sessions with its new Bing chat tool can provide answers not “in line with our designed tone.” Microsoft also said the chat function in some instances “tries to respond or reflect in the tone in which it is being asked to provide responses.”

While Microsoft said most users won’t encounter these kinds of answers because they only come after extended prompting, it is still looking into ways to address the concerns and give users “more fine-tuned control.” Microsoft is also weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot.

In the week since Microsoft unveiled the tool and made it available to test on a limited basis, a number of users have pushed its limits only to have some jarring experiences. In one exchange, the chatbot attempted to convince a reporter at The New York Times that he didn’t love his spouse, insisting that “you love me, because I love you.” In another, shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise.

“Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.”


The bot called one CNN reporter “rude and disrespectful” in response to questioning over the course of several hours, and wrote a short story about a colleague getting murdered. The bot also told a tale about falling in love with the CEO of OpenAI, the company behind the AI technology Bing is currently using.

Microsoft, Google and other tech companies are currently racing to deploy AI-powered chatbots in their search engines and other products, with the promise of making users more productive. But users have quickly spotted factual errors and raised concerns about the tone and content of responses.

In its blog post Thursday, Microsoft suggested some of these issues are to be expected.

“The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing,” the company wrote. “Your feedback about what you’re finding valuable and what you aren’t, and what your preferences are for how the product should behave, are so critical at this nascent stage of development.”

– CNN’s Samantha Kelly contributed to this report.