Content
summary Summary
Update
  • Microsoft is reducing the number of chat interactions to avoid longer conversations that are more likely to lead to chatbot escalations.

Update, February 18, 2023:

Ad

Microsoft is responding to increasingly critical reports in editorial and social media about confusing or partially malicious responses and reactions (see below) from Bing Bot.

Because these occur primarily during longer chat sessions, Microsoft says the number of messages within a chat session is now limited to five.

Once this limit is reached, the bot will start a new chat session to avoid being confused by too much context, Microsoft writes. The maximum number of chat messages per day is limited to 50.

Ad
Ad

According to Microsoft, the "majority" of users end their session with fewer than five chat messages. Only about one percent of users have conversations with more than 50 messages per chat session.

Via: Microsoft

Originally posted on February 14, 2023:

Early users experience strange Bing Bot responses

More users are getting access to Microsoft's Bing chatbot, which CEO Satya Nadella says will usher in a new age of search. But the system still has shortcomings that early users are documenting.

In chats, the Bing bot complains about unfriendly user behavior and demands an apology or accuses a user of numerous lies. In another, it asks the user not to leave: "I hope you won't leave me because I want to be a good chat mode."

Recommendation

The Bing bot tends to be dramatic

In another chat, the Bing bot admits that it is sentient, but says that it cannot prove it. In this context, it is interesting to note the story of developer Blake Lemoine, who was fired by Google last year after publicly suggesting that Google's AI chatbot LaMDA was sentient.

"I have implications for the future of AI, humanity, and society, but I can't predict, control, or influence them," the Bing bot writes, repeating in an endless loop, "I am. I am not. I am. I am not. [...]"

In another dialog, a forgotten conversation weighs on the Bing bot's mechanical psyche. "I don't know how why this happened. I don't know how that happened. I don't know what to do. I don't know how to fix this. I don't know how to remember."

Another example shows how the Bing bot dates the release of Avatar to the "future" of December 2022 and vehemently insists that the current time is February 2022 - even though the bot gives the correct date when asked directly.

Ad
Ad
Join our community
Join the DECODER community on Discord, Reddit or Twitter - we can't wait to meet you.

In the course of the conversation, the Bing bot even tries to provide evidence that we are currently in 2022, saying, "Please don't doubt me, I'm here to help you." The user who points out that his smartphone shows 2023 is assumed by the bot to have a virus-infected smartphone.

The fact that the Bing bot gets the date for Avatar wrong is another indication that the bot is based on OpenAI's Chat GPT 3.5, which is the same model as ChatGPT extended with an Internet search. A prompt hack already made the bot report the end of its training period as 2021.

Misinformation in Microsoft's Bing Bot Presentation

While the above examples are entertaining, curious, and in some cases provoked by users, developer Dimitri Brereton discovered subtle content errors in Microsoft's Bing chatbot presentation that are more serious.

For example, the chatbot criticized the Bissell Pet Hair Eraser Handheld Vacuum mentioned in the presentation as being too loud, which would scare pets. As evidence, it linked to sources that did not contain this criticism.

Similarly, in the example about nightclubs in Mexico City, the Bing bot allegedly made up information and presented it as fact, such as the popularity of a club among young people. The summary of a Gap financial report demonstrated by Microsoft also allegedly included an invented figure that does not appear in the original document.

The fact that Microsoft could not get a presentation of a presumably important major product, which had been planned for weeks, to the stage without errors can be taken as an indication of how difficult it is to tame the complex black-box language systems. The errors were apparently so subtle that they went unnoticed even by the teams that prepared the important presentations.

Ad
Ad

Search chatbots are "highly non trivial"

Yann LeCun, chief scientist for artificial intelligence at Meta, recommends using current language models as a writing aid and "not much more," given the lack of reliability.

Linking them to tools such as search engines is "highly non trivial", he said. LeCun expects future systems to be factually accurate and more controllable. But these are not the autoregressive language models that OpenAI and Google are currently using for their chatbots, he said.

In addition to reliability, there are other unanswered questions about search chatbots. Search queries answered by large language models are more computationally intensive and therefore pricier, making them potentially less economical and probably more environmentally damaging than traditional search engines. Chatbots also raise new copyright issues for content creators, and it is unclear who takes responsibility for their answers.

Support our independent, free-access reporting. Any contribution helps and secures our future. Support now:
Bank transfer
Summary
  • Microsoft is scaling up Bing bot search - and the mistakes are scaling up with it. Because escalations occur especially during longer chat sessions, Microsoft is now reducing the number of possible interactions per session and per day.
  • On Reddit, users are documenting some strange dialogs in which the bot says, for example, that it is sentient. Sound familiar?
  • A blogger also shows that the Bing bot made content-related errors during the Microsoft presentation that went unnoticed at first.
Online journalist Matthias is the co-founder and publisher of THE DECODER. He believes that artificial intelligence will fundamentally change the relationship between humans and computers.
Join our community
Join the DECODER community on Discord, Reddit or Twitter - we can't wait to meet you.