Microsoft Will Pay People If They Find Bugs In Their Bing AI Products
by Trisha Leigh
There’s been much ado about AI and how no one can really trust it to tell the truth or return helpful responses – and then, of course, there are the times it does worse than that, outright lying or spouting racist nonsense.
Microsoft thinks it’s finally chased the bugs off, though, so going forward it will pay anyone who can trip Bing up.
This new “bug bounty program” promises to reward users between $2,000 and $15,000 if they can detect “vulnerabilities” in its Bing AI products.
Presumably they’re saving the big bucks for anyone who can get it to jump the shark completely, i.e., being bigoted, racist, rude, or otherwise super embarrassing to its maker.
To claim the cash, the issue found must be previously unknown and either “important” or “critical” to security.
This is in response to Microsoft’s well-documented troubles handling Bing’s tendency to go rogue – drawing up a hit list, claiming to work as a spy, and even threatening people, to name a few instances.
They say this new “lobotomized” version (released less than a month later) is fit for public consumption, and honestly, there haven’t been any big reported issues thus far.
At least, not too many.
Only time will tell whether or not this bounty will show more cracks in the facade.
And if it doesn’t, maybe we should just assume it’s gotten better at hiding its true self…