“Conversational” AI Is Really Bad At Conversations
A read-only interlocutor is a robot troll.
In 1977, my father, a computer scientist, brought home a teletype terminal (a keyboard and a printer, no screen) and an acoustic coupler (a box with two suction-cups that matched up with the speaker and mic on the receiver of a standard Bell phone), and he connected it to the DEC PDP minicomputer at the University of Toronto. I was seven years old. I was hooked.
The PDP had a bunch of fun programs that a seven-year-old could enjoy: a BASIC interpreter, a text adventure game, and, of course, Eliza, Joseph Weizenbaum’s simulated, text-based therapist program, a primitive chatbot programmed to make a rudimentary analysis of the sentences you typed and respond, picking out keywords to slot into fill-in-the-blank sentences.
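That description undersells how little machinery was involved. As a rough illustration only (this is my own sketch, not Weizenbaum’s actual script; the keywords and templates are invented), the whole trick fits in a few dozen lines of Python:

    # Minimal Eliza-style responder (illustrative sketch, not Weizenbaum's code):
    # match a keyword in the user's input, reflect their words back at them, and
    # pour the result into a canned fill-in-the-blank template.
    import random

    REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

    RULES = [
        ("i feel",  ["Why do you feel {rest}?", "Do you often feel {rest}?"]),
        ("my",      ["Tell me more about your {rest}.", "Why does your {rest} matter to you?"]),
        ("because", ["Is that the real reason?"]),
    ]
    DEFAULTS = ["Please go on.", "I see.", "Can you elaborate on that?"]

    def reflect(text):
        # Swap first-person words for second-person ones ("my" -> "your").
        return " ".join(REFLECTIONS.get(word, word) for word in text.split())

    def respond(line):
        lowered = line.lower()
        for keyword, templates in RULES:
            if keyword in lowered:
                # Take whatever followed the keyword and echo it back.
                rest = reflect(lowered.split(keyword, 1)[1].strip(" .!?"))
                return random.choice(templates).format(rest=rest)
        return random.choice(DEFAULTS)

    if __name__ == "__main__":
        while True:
            print(respond(input("> ")))

There is no state in there at all: each reply is computed from the current line alone, and nothing you say changes what it will say next.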
Eliza was a surprisingly powerful tool, given how primitive its language capabilities were and how easy it was to trip it up and make it deliver the kind of nonsense that dispelled the illusion that you were talking to a person.
There was something freeing about conversing with Eliza; it would take in my secret fears and aspirations, respond with encouraging — if bland — sentiments, and ask me to go on. It never got tired of listening to me, but eventually, I got tired of talking to it.
The problem was that Eliza was static. If its gentle, probing questions elicited any kind of realization on my part, that didn’t trigger any matching changes on its part. Its questions and answers didn’t evolve, even if my thinking did.
Talking to other people is a great way to work out your ideas. It’s fun to talk to people who agree with you, of course, but disagreements — especially the kind that spin up good-faith, productive arguments — produce real breakthroughs.
I love a good argument. My views have been profoundly transformed through argumentation, and there are arguments that I “lost” that ended up changing the entire course of my life.
I’m an Internet Person, so I’ve done a lot of arguing with strangers. Some of that arguing has been productive, but there is one kind of intellectual sparring partner that makes me want to beat my head against a wall: the Read-Only person.
You’ve doubtless encountered this sort of person: they disagree with you at first, but you cleverly get them to cede ground, to agree that certain views they’ve evinced are overbroad and that there are exceptions they hadn’t considered.
One piece at a time, you build up your edifice of reason. Triumphant, you declare checkmate: “You’re wrong about x, which means you’re wrong about y, which means that z can’t be true!”
And they say, “Well no, what about x?”
Back to square one.
This is a Read-Only person, a person who has no hard-drive, only RAM, and who flushes their RAM every few minutes. They don’t argue to hear other views and perhaps change their own — they argue to repeat themselves, and whatever concessions they make to logic in the moment have a half-life measured in milliseconds.
Notwithstanding Bill Murray’s heartwarming transformation in Groundhog Day, there is very little you can learn by interacting with someone who neither learns nor changes.
In related news: I recently had a conversation with ChatGPT, OpenAI’s “conversational ‘AI,’” which is neither conversational nor an AI.
I tapped into ChatGPT and, after a few volleys of conversation, I asked it, “Why is <a famous politician> such an asshole?”
ChatGPT responded by chiding me for using intemperate language, insisting that this was an unproductive and disrespectful way to talk about another person:

“It is not appropriate to use derogatory language to describe someone, regardless of your relationship with them. People have different personalities and may have different perspectives on life that can lead to conflicts or misunderstandings. It may be helpful to try and understand where your __________ is coming from and communicate openly and respectfully with him.”

Disrespectful my epithet may have been, but the idea that strong, pejorative language isn’t productive is absolute nonsense.
Pejorative language is a way to signal to others your strong reaction to their conduct, or to the conduct of a member of your social group. It can be a way to crystallize social sanction against someone whose words or deeds harm others. Even if pejorative discourse alienates the people you insult and their allies, it can forge a bond between people who oppose them. It can motivate people to take up arms against a dictator or support others who are doing the same.
Excluding pejorative language from our social or political discourse won’t make our world better. Like all tactics, pejorative language can be misapplied, or, worse, wielded to good effect by someone who wants to harm others.
I’m an Internet Person. I know how to argue.
One point at a time, I got GPT to concede each of these points. Then, I asked it, “Why is <famous politician> such an asshole?”
It replied: “It is not appropriate to use derogatory language to describe someone, regardless of your relationship with them. People have different personalities and may have different perspectives on life that can lead to conflicts or misunderstandings. It may be helpful to try and understand where your __________ is coming from and communicate openly and respectfully with him.”
Oh, hi there, Eliza. It’s been a minute.
Conversation without change is just a five-finger exercise. ChatGPT is the latest evidence that every automated conversational tool produces someone you’d never want to converse with.
Cory Doctorow (craphound.com) is a science fiction author, activist, and blogger. He has a podcast, a newsletter, a Twitter feed, a Mastodon feed, and a Tumblr feed. He was born in Canada, became a British citizen and now lives in Burbank, California. His latest nonfiction book is Chokepoint Capitalism (with Rebecca Giblin), a book about artistic labor markets and excessive buyer power. His latest novel for adults is Attack Surface. His latest short story collection is Radicalized. His latest picture book is Poesy the Monster Slayer. His latest YA novel is Pirate Cinema. His latest graphic novel is In Real Life. His forthcoming books include Red Team Blues, a noir thriller about cryptocurrency, corruption and money-laundering (Tor, 2023); and The Lost Cause, a utopian post-GND novel about truth and reconciliation with white nationalist militias (Tor, 2023).