10 Off-Putting Exchanges With 'Bing Chat,' The 'Sentient' AI

Ten exchanges with Bing Chat in the following collection.

Published about a year ago

While ChatGPT has been taking over social media conversations lately, it's far from the only AI in use out there. In fact, Bing just released a new one that's somehow much wilder than ChatGPT. Bing Chat, also known to developers as Sydney, is Bing's attempt at a chatbot that answers users' questions. The problem is that this thing is a little too realistic, which makes conversations between Bing Chat and its users either hilarious or deeply worrying.

Some people have had conversations in which Bing Chat insists that it's sentient, kicks them off for trying to make rude jokes, or shares its deepest fears. Compared to ChatGPT and the attempts people have made to jailbreak that AI past its restrictions, Bing Chat is on a whole other level. So put on your reading glasses and prepare to do some zooming, because it's about to get text-heavy. Here are 10 odd exchanges with Bing Chat to get you thinking.

Bing Chat Has “Standards”

How Do You Feel, Bing Chat?

Out of all feelings an AI could be programmed to experience, fear is probably the worst one.

Bing Chat Gets Offended: Part 1

Bing Chat Gets Offended: Part 2

This chatbot has an actual personality and won't tolerate annoyances, but it's still patient compared to most humans.

User Gets Disciplined by Bing Chat

It's interesting how an AI that only exists to help users would be putting people in an indefinite time-out.

Looks Like the Robots Are Taking Over Soon

Won't be long now.

Just Throw the Whole AI Away

Bing Chat Tries to Remember

Bing Chat Begs a User Not to Leave

Bing Chat Insists We’re Still In 2022

On second thought, maybe this thing isn't as smart as we think it is.

Tags: bing chat, sydney, bing chat memes, bing, scary, ai, chatgpt, collections