Do chatbots dream of human sweethearts?
That’s one of the many questions a reader asks herself after scrolling through the transcript of a conversation—there’s no other word for it—between a New York Times technology columnist and the new release of Microsoft’s Bing search engine. New Bing differs from Old Bing in that it’s powered by artificial intelligence software from OpenAI, the same folks who brought us ChatGPT. And therein lies all the difference.
* * *
Kevin Roose, the Times columnist, wrote a story, published online on February 16 and on the front page of the print edition on February 17, about his exchange with Bing’s AI-powered chatbot, or neural network—or “Sydney,”* its internal code name. (Gift link.) The Roose-Sydney convo, which lasted two hours, was “bewildering and enthralling”; “the strangest experience I’ve ever had with a piece of technology.”
Here’s the nut graf:
As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.
The declaration of love. (For reasons unknown, Sydney ends every paragraph with an emoji.) Read the complete transcript (gift link).
What does this have to do with the word of the week? Let Roose explain: “I know that these A.I. models are programmed to predict the next words in a sequence,” he writes, “not to develop their own runaway personalities, and that they are prone to what A.I. researchers call ‘hallucination,’ making up facts that have no tether to reality.”
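Next-word prediction is easy to picture with a toy example. Here's a minimal Python sketch (the miniature "corpus" and all names are invented for illustration, and real models are vastly more sophisticated) of a bigram model that greedily emits whatever continuation is statistically likeliest, which is exactly why such a model can sound fluent while having no notion of truth:

```python
# Toy illustration of next-token prediction: a bigram model that always
# picks the most probable continuation, with no notion of truth.
# The tiny "corpus" below is invented purely for illustration.

from collections import Counter, defaultdict

corpus = (
    "sydney loves the user . sydney loves secrets . "
    "the user loves his wife . sydney wants to be human ."
).split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the statistically likeliest next word: plausible, not true."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else "."

# Generate a "confident" continuation one token at a time.
word, output = "sydney", ["sydney"]
for _ in range(5):
    word = predict_next(word)
    output.append(word)
print(" ".join(output))  # e.g. "sydney loves the user . sydney"
```

The model never asks whether "sydney loves the user" is true; it only asks which word most often came next in its training text. Scale that mechanism up by billions of parameters and you get fluency, but the indifference to truth remains.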
OK, I admit that I had not known about this extended usage of hallucination, which struck me as weirdly anthropomorphic.
Here’s how a Wikipedia entry defines it:
In artificial intelligence (AI), a hallucination or artificial hallucination is a confident response by an AI that does not seem to be justified by its training data. For example, a hallucinating chatbot with no knowledge of Tesla’s revenue might internally pick a random number (such as “$13.6 billion”) that the chatbot deems plausible, and then go on to falsely and repeatedly insist that Tesla’s revenue is $13.6 billion, with no sign of internal awareness that the figure was a product of its own imagination.
The analogy is to a hallucination in human psychology: “a perception in the absence of an external stimulus that has the qualities of a real perception.”
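The Wikipedia example translates neatly into a deliberately crude sketch. This toy Python "chatbot" (the knowledge base and figures are invented, and a real model doesn't literally call a random-number generator) answers questions outside its data by fabricating a plausible number and presenting it exactly like a genuine answer:

```python
import random

# A tiny invented "knowledge base": this toy bot knows nothing about Tesla.
KNOWLEDGE = {"what is the capital of france?": "Paris."}

def toy_chatbot(question: str) -> str:
    """Answer from the knowledge base, or confidently fabricate a figure."""
    known = KNOWLEDGE.get(question.lower())
    if known is not None:
        return known
    # "Hallucination": no data, so invent a plausible-looking number and
    # state it with the same confidence as a genuine answer.
    made_up = f"${random.uniform(1, 20):.1f} billion"
    return made_up + "."  # nothing marks this as a guess

print(toy_chatbot("What is the capital of France?"))  # Paris.
print(toy_chatbot("What was Tesla's revenue?"))       # e.g. $13.6 billion.
```

The telling detail, in the toy as in the real thing, is that the fabricated answer arrives in the same confident register as the true one: nothing in the output signals which is which.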
I searched for, but couldn’t pin down, a date for the first use of this extended, non-human sense of hallucination. The earliest citation I’ve been able to find so far is a March 9, 2018, article in Wired, “AI Has a Hallucination Problem That’s Tough to Fix.” The usage must have been familiar enough by then not to require definition: “It’s not clear how to protect the deep neural networks fueling innovations in consumer gadgets and automated driving from sabotage by hallucination.”
The noun hallucination entered English a few decades after the verb to hallucinate, which the OED tracks back to 1604. Its origins are Latin: “(h)allūcinārī (more correctly ālūcinārī), to wander in mind, talk idly, prate.” Neither the OED nor any other dictionary I consulted has an AI-associated entry for hallucination—not even Techopedia.
Nevertheless, with AI developing apace we can expect to hear a lot more about supposedly non-sentient creations experiencing humanoid delusions. Here’s how Kevin Roose concludes his article:
These A.I. models hallucinate, and make up emotions where none really exist. But so do humans. And for a few hours Tuesday night, I felt a strange new emotion — a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same.
__
* Why “Sydney”? I don’t know. Maybe because it’s a name with many facets: male, female, surname, a city in Australia. Its origins are similarly ambiguous: It either comes from Old English roots meaning “wide meadow” or is a contraction of French “St. Denis.” Sydney Carton is a major character in Dickens’s A Tale of Two Cities, which may or may not be relevant. Maybe we should ask Sydney/Bing.
Try Language Log in early discussions about Google Translate for earlier uses.
Posted by: Brett Reynolds | February 20, 2023 at 06:17 AM
Random garbage in, organized, curated garbage out.
Posted by: Dan Freiberg | February 20, 2023 at 10:51 AM