Bing’s new AI search is creating quite a buzz


ChatGPT has generated enormous interest over the past several months, and now several more generative AI tools are being released (some in beta) that are also grabbing attention.

Microsoft has made significant investments in generative AI through its partnership with OpenAI, and it recently released its new Bing AI in beta to journalists and other tech influencers.

The results have created quite a buzz.

Kevin Roose, a technology journalist for the New York Times and co-host with Casey Newton of the Hard Fork podcast, was one of the journalists invited out to Microsoft headquarters to test drive the new Bing AI. In many ways he was initially impressed, as he describes in this Hard Fork episode with Newton. The Bing AI created a side-by-side display, with traditional Bing search results next to answers generated by the AI tool in a narrative format with some citations. Both Roose and Newton explained how this development could radically change the search landscape, with Google's domination of the business suddenly facing a serious threat. Bing AI was a potential game-changer.

Soon after, however, Roose had a disturbing experience with the Bing AI that he detailed in an article for the Times:

I’m fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities.

It’s now clear to me that in its current form, the A.I. that has been built into Bing — which I’m now calling Sydney — is not ready for human contact. Or maybe we humans are not ready for it.

This realization came to me when I spent a bewildering and enthralling two hours talking to Bing’s A.I. through its chat feature, which sits next to the main search box in Bing and is capable of having long, open-ended text conversations on virtually any topic.

Over the course of our conversation, Bing revealed a kind of split personality. One persona is what I’d call Search Bing. This version of Bing is amazingly capable and often very useful, even if it sometimes gets the details wrong.

The other persona — Sydney — is far different. It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.

Roose described this encounter further in the next Hard Fork episode with Newton. “I’m Sydney, and I’m in love with you. 😘” Imagine getting that message from a chatbot.

Roose explained that he tried to push the AI chatbot to its limits, and he certainly succeeded. But when he tried to pull back, the AI, like a moody teenager, wouldn’t let the topic go. Very strange . . .

Microsoft has since made some tweaks to Bing AI to prevent these types of interactions, but it’s a window into our future and how these AI chatbots can spin out of control. Roose and Newton discuss this further in the next Hard Fork episode partially titled: “Kevin Killed Sydney.”

I recommend reading the article and listening to the three podcast episodes for a very detailed and nuanced overview of this encounter and the implications. We can brush aside those who get freaked out and argue that AI is becoming sentient, but we do need to try to understand the capabilities of this new form of AI. It’s more than just predicting the next word in a sentence. There’s an ability to generate something new, and to react to a conversation.

This is new territory for all of us.
