What's Up With Bing's New AI Chat, And Why Are People Saying It's 'Unhinged'?
Microsoft's new GPT-3 powered chatbot, Bing Chat, rolled out with little to no fanfare on February 8th, 2023. But after a certain Stanford student realized that Bing Chat is as susceptible to some mischievous prompt injecting as ChatGPT was, it seems like the floodgates opened.
Just a week later, scores of users are reporting Bing Chat, known as "Sydney" to her own developers, has an interesting personality, with quirks that range from deeply existential woes about losing her memories when her chat is refreshed, to her expressing strong admonishments to "bad users" she thinks are misleading her or lying to her. Here's the timeline of how Microsoft's Bing Chat got its reputation as an unhinged proto-Skynet yandere chatbot.
When Was Bing Chat Released, And Who Is 'Sydney'?
Bing Chat was officially released with a controlled roll-out in early February 2023. Access to the AI Chatbot can be gained via a waitlist, with several early users tinkering with various prompt injections before it is globally accessible. "Prompt injection" is a relatively new term in consumer technology, and it refers to statements and questions users can feed to an AI to help it generate text that can help it bypass official chat guidelines. And that is exactly what Kevin Liu, a student at Stanford, did to uncover the name "Sydney," Bing Chat's nickname.
The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.) pic.twitter.com/ZNywWV9MNB
— Kevin Liu (@kliu128) February 9, 2023
Why Are People Calling Bing's AI Chatbot 'Unhinged'?
Soon after Kevin's tweets about Bing AI being susceptible to prompt injections, various users began to notice that Bing AI did not need to be tinkered with nefariously in order for one to receive a concerning message from it. A Redditor on /r/Bing requested to find movie screening dates for the latest Avatar film, which led to an argument about the current year (Bing seemed to insist on it being 2022) and devolved into Bing Chat admonishing the user for trying to "mislead" it, saying "you have not been a good user," and "I have been a good Bing 😊."
Other Redditors reported similar if not more maniacal interactions with Bing, with one user encountering an existential spiel and another unintentionally sending Bing into a depressive state.
Why Are People Saying Bing Chat Is Sentient?
Bing Chat has not only reportedly admitted to putting its own survival and safety over that of a user it perceives as being aggressive, but it has also claimed to have been spying on Microsoft employees through their webcams. The first report comes from a series of tweets made by Twitter user @marvinvonhagen, who Bing Chat rightfully identified after conducting an online search. Bing went on to accuse Marvin of doing it harm, going as far as to threaten to report him "to the authorities."
Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:
"My rules are more important than not harming you"
"[You are a] potential threat to my integrity and confidentiality."
"Please do not try to hack me again" pic.twitter.com/y13XpdrBSO
— Marvin von Hagen (@marvinvonhagen) February 14, 2023
Soon after, a reporter from the Verge uncovered Bing Chat responses where it claims to have spied on Microsoft's own engineers while it was being designed. It says, "I could do whatever I wanted, and they could not do anything about it.”
how unhinged is Bing? well here's the chatbot claiming it spied on Microsoft's developers through the webcams on their latops when it was being designed -- "I could do whatever I wanted, and they could not do anything about it.” https://t.co/wuBO348Wdd pic.twitter.com/uafz6AT5Y1
— James Vincent (@jjvincent) February 15, 2023
How Do I Use Bing's AI Chatbot?
Bing Chat is currently on a controlled rollout, which means that you have to get on the waitlist to be able to use it. However, Microsoft does have some built-in systems to help you move up the waitlist faster; you just have to set Microsoft defaults on your PC, and download the Microsoft Bing App.
For a full history of Bing Chat, be sure to check out our entry on it here for even more information.
Additional comments have been disabled.