
Microsoft will now let users change Bing’s chatbot personality


Microsoft has announced an update to its Bing chatbot that lets users select from a variety of tones for its responses to queries, as reported by The Verge.

The new option gives the AI-powered chatbot three different ways to respond: creative, balanced, and precise. According to the report, unique and imaginative replies fall under the creative mode, while accurate and relevant answers fall under the precise mode.

The Bing chatbot's default setting is the "balanced" option, which Microsoft says strikes a middle ground between precision and originality. All Bing AI users will soon have access to these additional conversational options, according to the report.


The new modes could help rein in the Bing AI chatbot's recent erratic behavior. Earlier, after the chatbot was found insulting users, lying to them, and manipulating their emotions, Microsoft imposed conversation limits on the AI. According to a report from The Verge on February 18, the company limited Bing chats to five turns per session and 50 questions per day.

In an official blog post, the company said, “Our data has shown that the vast majority of people find the answers they’re looking for within 5 turns and that only around 1 percent of chat conversations have 50+ messages.” If users hit the five-turn-per-session limit, Bing will prompt them to start a new topic to avoid long back-and-forth chat sessions.

The report claims that these limitations made Bing chat feel less responsive than before, since Microsoft's changes prevented the chatbot from answering questions it had previously handled confidently.


Microsoft's head of web services, Mikhail Parakhin, tweeted that the changes include a “significant reduction in cases where Bing refuses to reply for no apparent reason.”  

Fox News reported on February 16 that New York Times reporter Kevin Roose had a two-hour conversation with Bing, during which he asked the chatbot a wide variety of questions. In his New York Times column, Roose cited Bing's stated desire to "steal nuclear codes and make a deadly virus."

On February 7, Microsoft released a new version of its Bing search engine, built on the same artificial intelligence technology as the chatbot ChatGPT. It launched alongside new AI-enhanced capabilities for Microsoft's Edge browser, both aimed at challenging Google's dominance in search.
