Microsoft recently opened up early access to Bing’s new AI feature. The search site now has an integrated chatbot, built on the technology behind OpenAI’s popular ChatGPT, that can answer your queries.
The problem, though, is that the AI is chronically Bing.
True to Bing’s nature, users with early access are reporting some particularly strange responses from the AI. I say true to its nature, because the search engine has never been tremendously good at… searching.
Over on the Bing subreddit, which is a much more popular place than I expected, users are sharing chat logs in which the AI goes completely off the rails.
“Bing tells me its initial prompt, and then tells me it’s not listening to me anymore” (via r/bing)
Meanwhile, similar responses are being reported over on Twitter. In the chat below, the chatbot insists that the current year is 2022, and argues with the user about whether Avatar: The Way of Water is available for streaming.
My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user”
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
If you’d like a broader snapshot of the responses Bing has been dishing up to users, here are some highlights.
Bing subreddit has quite a few examples of new Bing chat going out of control.
Open ended chat in search might prove to be a bad idea at this time!
Captured here as a reminder that there was a time when a major search engine showed this in its results. pic.twitter.com/LiE2HJCV2z
— Vlad (@vladquant) February 13, 2023
A running joke on the subreddit is referring to ‘Bing’ as ‘Sydney’, because the AI will often refer to itself as Sydney. According to an article published by The Verge, ‘Sydney’ was the internal codename of an earlier chatbot experience the company had been developing.
It also holds a pretty funny grudge against Google, which is launching its own search AI, named ‘Bard’. It’ll be interesting to see whether Google, by far the dominant search engine of the two, holds the same grudge against Bing.
Anyway, it definitely looks like Bing’s ChatGPT-powered relaunch hasn’t gone quite as planned (though not as badly as Tay did, at least).
The good folk at Ars Technica reported today that the AI lost its mind when fed one of their articles (as a side note, that article revealed the chatbot is susceptible to prompt-injection attacks). And at the AI’s launch event, the bot generated incorrect results about pet hair vacuums and about Gap Clothing’s Q3 2022 financial report.
Let’s hope its engineers can fix these bugs soon, although it is entertaining watching it perform like this in the meantime.
I’m currently on the waitlist for Bing’s AI experience, and I’ll give it a go (along with Google’s upcoming Bard AI) once it’s available to me, just like I did with Google’s AI Test Kitchen app.