I now have access to Bing AI, and Microsoft has restricted it to the point of being useless. If you try to correct mistakes, it responds with "I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.🙏"
— Jacob Aron (@jjaron) February 20, 2023
People are finding that it’s no more accurate than before, and there’s no way to correct it when it returns an answer you know is wrong.
Others say that not only are they limited in the number of queries they can submit, but the answers are shorter as well.
“I found out that in the newer version of bing chat the answers are very short, even when asked directly to answer in a complete and detailed way. The problem is the constraint is too restrictive, so much that some of the answers are almost useless.”
Bing chat after patch is way worse
by u/Apollodoro2023 in bing
This developer says that, with the changes, Bing AI is “pretty much useless” to coders.
“As a developer, I know how valuable search engines can be when it comes to solving coding problems. However, the limits imposed by Bing’s AI chatbot make it difficult to fully explore complex coding issues.”
Using Bing Chat as a search tool as a Coder is pretty much useless now.
by u/DavidG117 in bing
Microsoft seems to have realized that its inaccurate and sometimes-creepy AI tool was bad for business, but this “lobotomized” version isn’t winning any fans, either.
I’m sure of one thing: they’ll definitely try again.