The competition among tech companies to be the first to present an AI chatbot that not only provides coherent and correct answers but also doesn't become completely unhinged when it can't is heating up.
Microsoft, of course, is one of the major players, but they’re struggling with how, exactly, to address the “not so unhinged” aspect of their resident bot, Sydney.
At first, they announced that they would majorly restrict the Bing artificial intelligence chatbot.
Then they issued a statement that seemed to reverse course, loosening the caps on the number and length of responses because users were enjoying the “long and intricate chat sessions” with Sydney.
“The first step we are taking is we have increased the chat turns per session to six and expanded to 60 total chats per day. Our data shows that for the vast majority of you this will enable your natural daily use of Bing.”
Not only that, they say they will “increase the daily cap to 100 total chats soon.”
Only a few days ago there were no limits at all, though the strange (and sometimes horrifying) behavior of the bot should have inspired some kind of action.
“We are also going to begin testing an additional option that lets you choose the tone of the Chat from more Precise – which will focus on shorter, more search-focused answers – to Balanced, to more Creative – which gives you longer and more chatty answers. The goal is to give you more control on the type of chat behavior to best meet your needs.”
While it’s good that Microsoft is listening to user feedback, it does sound a bit like they’re making all of this up as they go along – which isn’t that wild to consider, given that this is brave-new-world territory for all of us.
These examples, and the general confusion, seem to back up the idea that AI isn’t as close to being useful and coherent as its backers would like to suggest.
Maybe it will get there eventually, but in the meantime, who doesn’t love a good trainwreck?