The stories about how Microsoft essentially created a monster in its AI, one that would give deeply problematic answers to queries, among other things, have made the rounds.
It would seem Microsoft is listening to the feedback, which ranges from amused to horrified, as the company has “lobotomized” the program.
The move comes after the tool, codenamed “Sydney,” went on tirades that landed odd stories in news feeds everywhere. This was not, obviously, the sort of publicity Microsoft wanted for its tool, so they decided to give it firm limits instead.
Now the chatbot is capped at 50 chat turns per day and five turns per session, which Microsoft hopes will cut down on the bizarre and sometimes disturbing responses.
That said, Microsoft blamed users for the change rather than the faultiness of their tool.
“Our data has shown that the vast majority of you find the answers you’re looking for within five turns and that only roughly one percent of chat conversations have 50+ messages. After a chat session hits five turns, you will be prompted to start a new topic.”
I now have access to Bing AI, and Microsoft has restricted it to the point of being useless. If you try to correct mistakes, it responds with "I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏"
— Jacob Aron (@jjaron) February 20, 2023
People are finding that the restricted version is no more accurate than before, and there is now no way to correct it when it returns an answer you know for sure is wrong.
Others say that not only are they limited in the number of queries they can submit, but the answers are shorter as well.
“I found out that in the newer version of bing chat the answers are very short, even when asked directly to answer in a complete and detailed way. The problem is the constraint is too restrictive, so much that some of the answers are almost useless.”
One developer says that, with the changes, Bing AI is “pretty much useless” to coders.
“As a developer, I know how valuable search engines can be when it comes to solving coding problems. However, the limits imposed by Bing’s AI chatbot make it difficult to fully explore complex coding issues.”
Microsoft seems to have realized that an inaccurate and sometimes-creepy AI tool is bad for business, but this “lobotomized” version isn’t winning any fans, either.
I’m sure of one thing: they’ll definitely try again.