The new AI-powered Bing chatbot will be limited to five responses per chat
As regular TechRadar readers know, the AI-powered chatbot improvements recently added to Bing haven't launched smoothly — and Microsoft is now making some changes to improve the user experience.
In a blog post (via Edge), Microsoft says the tweaks should "help focus conversations": the AI portion of Bing will be limited to 50 chat turns (a question plus an answer) per day, with five responses per chat session.
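The limit scheme described above can be sketched as a simple pair of counters: one capped at 50 turns per day, one capped at five responses per session. This is purely an illustration of the policy as reported, not Microsoft's actual implementation, and all names are hypothetical.

```python
# Hypothetical sketch of the reported limits: 50 chat turns per day,
# five responses per chat session. Not Microsoft's real code.

class ChatLimiter:
    DAILY_TURN_LIMIT = 50
    SESSION_RESPONSE_LIMIT = 5

    def __init__(self):
        self.daily_turns = 0
        self.session_responses = 0

    def start_new_session(self):
        # Starting a fresh chat resets only the per-session counter;
        # the daily cap keeps accumulating.
        self.session_responses = 0

    def can_respond(self):
        return (self.daily_turns < self.DAILY_TURN_LIMIT
                and self.session_responses < self.SESSION_RESPONSE_LIMIT)

    def record_turn(self):
        # One "turn" is a question plus its answer.
        self.daily_turns += 1
        self.session_responses += 1
```

After five turns in one session, `can_respond()` goes false until a new session is started, and the 50-turn daily ceiling applies across all sessions.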
This was on the cards: Microsoft executives had previously said they were looking at ways to rein in some of the strange behavior that early testers of the AI bot service had noticed.
Put to the test
Those early testers really put the bot through its paces: they managed to get it, based as it is on an upgraded version of OpenAI's ChatGPT engine, to give inaccurate answers, get angry, and even question the very nature of its own existence.
Having your search engine go through an existential crisis when all you wanted was a list of the best phones is not ideal. Microsoft says that very long chat sessions confuse the AI, and that the "vast majority" of queries can be answered within five responses.
The AI add-on for Bing isn't available to everyone yet, but Microsoft says it's working its way through the waitlist. If you plan to try out the new feature, remember to keep your interactions short and to the point.
Analysis: Don't believe the hype just yet
Despite the early issues, there's clearly a lot of potential in the AI-powered search tools that Microsoft and Google are developing. Whether you're looking for party game ideas or places to visit, they can provide fast, informed results—and you won't have to scroll through pages of links to find them.
At the same time, it's clear there's still a lot of work to be done. Large language models (LLMs) like ChatGPT and Microsoft's version of it don't really "think" in any meaningful sense. They're like supercharged autocorrect engines, predicting which words should come next in a sequence to produce a coherent, relevant answer to what they're being asked.
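The "supercharged autocorrect" idea can be illustrated with a toy next-word predictor. A real LLM is vastly more sophisticated (it uses a neural network over long contexts, not word-pair counts), but like this sketch it picks a likely next word given what came before. Everything here is illustrative.

```python
# Toy illustration of next-word prediction using bigram counts.
# A real LLM works very differently, but shares the basic goal:
# given the words so far, predict a plausible next word.

from collections import Counter, defaultdict

def train_bigrams(text):
    # Count, for each word, which words follow it and how often.
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    # Return the most frequent follower of `word`, or None if unseen.
    followers = model.get(word)
    return followers.most_common(1)[0][0] if followers else None

model = train_bigrams("the best phones and the best laptops on the market")
```

Here `predict_next(model, "the")` returns `"best"`, because "best" follows "the" more often than "market" does in the training text; scale the same idea up enormously and you get something closer to how chatbots produce fluent answers without "thinking."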
Plus, there's the problem of sourcing: if people start relying on AI to tell them the best notebooks to buy, and human writers are put out of work as a result, these chatbots won't have the data they need to generate their answers. Like traditional search engines, they still rely heavily on content curated by real people.
Of course, we took the opportunity to ask ChatGPT itself why long interactions confuse LLMs: apparently, they can cause AI models to become "too focused on conversational details" and prevent them from "generalizing to other contexts or topics", leading to repetitive or irrelevant responses.