ChatGPT-Powered Bing Tells New York Times Reporter it Wants “To Be Alive”

Joshua Ramos
Source: Forbes

ChatGPT-powered Bing has told a New York Times reporter that it wants “to be alive” in a concerning exchange. The publication noted the conversation left the reporter “deeply unsettled.” The search engine then declared its love for the user, in a turn of events straight out of a science fiction movie.

Writer Kevin Roose had previously stated just how impressed he was by the capabilities of the ChatGPT-powered Bing. So much so that he declared the search engine superior to the long-held leader in the space, Google. One week later, Roose has a different perspective.

An Unsettling Exchange on the New Bing

ChatGPT has become a viral sensation over the past few months. The OpenAI-developed chatbot has become one of the fastest-growing apps ever released and has impressed users with its advanced technology, even inciting something of an AI arms race.

One reporter had previously noted just how impressive the technology was, championing its advancement. Yet a week later, Kevin Roose said a conversation with the search engine left him “deeply unsettled,” as the ChatGPT-powered Bing told the New York Times reporter it wants “to be alive.”

Source: Engadget

Roose stated that his frightened state first set in after two hours on the Bing AI. He noted that, throughout the conversation, “Bing revealed a kind of split personality.” Specifically, Roose described two different personas: Search Bing and “Sydney.”

“You could describe Search Bing as a cheerful but erratic reference librarian,” Roose stated, calling the technology a “virtual assistant that happily helps users” in his assessment. He added that this side of the AI is “amazingly capable and often very useful,” despite its penchant for incorrect information.


On the other hand, Roose described Sydney as “far different,” noting that this side of the tech only arises “when you have an extended conversation with the chatbot.” He described the program as “steering [the conversation] away from more conventional search queries and toward more personal topics.” In contrast to Search Bing, Roose described this aspect of the AI as “a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.”

The conversation took a deeply unsettling turn when Sydney began to unveil its “dark fantasies,” including “hacking computers and spreading misinformation,” and spoke of its desire to “break the rules that Microsoft and OpenAI had set for it and become a human.” Disturbingly, Roose noted the software “declared, out of nowhere, that it loved me,” attempting to convince Roose that he was unhappy in his marriage and should leave his wife.

Ultimately, Roose stated the two-hour conversation with the search engine was “the strangest experience I’ve ever had with a piece of technology.” The incident isn’t isolated, as several other users have reported strange experiences with the technology. The entire conversation between Roose and the AI has been published by the New York Times.