
71% of new Bing users have given a thumbs up to a response they received from Bing’s AI chatbot
But it turns out that while the new Bing is built around conversational AI, some users are finding it more akin to chatting with the late, great Don Rickles (Google him). On social media, testers of the new Bing have reported being insulted by the chatbot, while others say it has manipulated or lied to them. Still, these cases appear to be in the minority, as Microsoft notes that 71% of those who received a response from the new Bing gave the answer a thumbs up.

If you’re on the new Bing waitlist, you will see this notice at Bing.com
Microsoft says, “The only way to improve a product like this, where the user experience is so much different than anything anyone has seen before, is to have people like you using the product and doing exactly what you all are doing. We know we must build this in the open with the community; this can’t be done solely in the lab. Your feedback about what you’re finding valuable and what you aren’t, and what your preferences are for how the product should behave, are so critical at this nascent stage of development.”
Since AI chatbots tend to give false answers, a problem known in the AI world as "hallucinations," don't be surprised if some responses are wrong. To get more helpful, focused, and accurate answers from Bing, Microsoft says users should keep a single chat session to fewer than 15 questions. Otherwise, the company says, "Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone."
Microsoft points out that very long chat sessions can confuse the chatbot, causing it to lose track of the questions being asked. The software giant is considering adding a toggle that would allow the user to "refresh the context" or "start from scratch."
Some new Bing users have had sessions lasting as long as two hours
Additionally, the model tries to respond in the same tone in which a query is asked, which can lead to responses from Bing that are not in the style Microsoft intended. And Microsoft wants those who encountered bugs such as broken links, incorrect formatting, and slow loading to know that many of these have already been fixed through daily releases, with weekly releases addressing even more issues.
Lastly, the company says, “We are thankful for all the feedback you are providing. We are committed to daily improvement and giving you the absolute best search/answer/chat/create experience possible. We intend to provide regular updates on the changes and progress we are making. Please keep the feedback coming.”
The testing continues.