AI & The Henry Higgins Effect
Why we put such trust in Artificial Intelligence at the expense of our own.
THE FLOWER GIRL Ah — ah — ah — ow — ow — oo!
THE NOTE TAKER [whipping out his book] Heavens! what a sound! Ah — ah — ah — ow — ow — ow — oo!
THE FLOWER GIRL Garn!
THE NOTE TAKER. You see this creature with her kerbstone English: the English that will keep her in the gutter to the end of her days. Well, sir, in three months I could pass that girl off as a duchess at an ambassador’s garden party.
In this little scene from Pygmalion by George Bernard Shaw, we first glimpse Henry Higgins (here listed as The Note Taker) as he observes The Flower Girl. In that early scene, audiences can laugh at The Flower Girl’s bizarre language, just as we’re swept up in Higgins’ commanding words, his rhetorical force. The girl seems incapable of producing recognizable English; the man creates symphonies of syllables. The Flower Girl, whom we later learn to be Eliza Doolittle, will become his test subject. And so begins the business of Shaw’s provocative fable.
ELIZA would become the name of the first chatbot, created between 1964 and 1966 by Joseph Weizenbaum, who drew the name from Pygmalion, likely also influenced by the 1964 theatrical release of My Fair Lady, the movie musical adaptation. His choice of name ties his project to Pygmalion’s hypothesis that language makes the person, or perhaps that language is a source of socio-economic prejudice. As ELIZA played out the script of a Rogerian psychotherapist, people began to talk with his chatbot system as though it were a person.
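Mechanically, that Rogerian script was little more than keyword matching and pronoun reflection. A toy sketch in Python (my illustration, not Weizenbaum’s actual MAD-SLIP code; the keywords and templates here are invented) shows how thin the trick is:

```python
import re

# Invented stand-ins for ELIZA's keyword rules: match a pattern,
# reflect the user's pronouns, and echo a question back.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    """Return a reflected question for the first matching rule."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # a content-free prompt to keep the patient talking
```

Type “I feel sad about my job” and the sketch answers “Why do you feel sad about your job?” No understanding anywhere, only a mirror, and yet people confided in it.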
Granting a piece of software human consciousness has become known as the Eliza effect, after the persuasive performance of Weizenbaum’s system. Much like Eliza Doolittle of Shaw’s play, the ELIZA system was convincing, could “pass” even. Surely OpenAI is aware of this Eliza effect as it promotes ChatGPT, whose omni incarnation presents a more sarcastic persona through voice interaction, one that can sound not unlike Scarlett Johansson in Her. The term Eliza effect, however, seems to miss the point, and so I give you the Henry Higgins effect, or the Higgins effect for short.
(You can read more about and talk to ELIZA on the website for our forthcoming book about the famous chatbot system.)
The Henry Higgins Effect
The Higgins effect names our tendency to award intelligence to beings or systems that perform signs of intelligence we deem superior to our own, and to subordinate our linguistic production to the machine at the cost of our individual and authentic thought and expression. Large language models play Higgins to our Eliza: we exchange our own authentic language (and thought) for their perceived superior language as a means of social improvement. In the words of OpenAI founder Sam Altman, the company is designing ChatGPT to fill the role of “a super-competent colleague that knows absolutely everything about my whole life, every email, every conversation I’ve ever had, but doesn’t feel like an extension.” Like Higgins, systems that speak in standard white English and mansplain with a certain rhetorical appearance of authority receive more credit than they are due. (As a cis-gender white male professor of writing, I know this only too well.)
It’s easy to forget, amid talk of the Eliza effect, that the system presented itself not as a young flower girl but as DOCTOR, a psychotherapist. What prompted users to open up was not discussing their emotions with a friend but confessing them to an authority. Now, obviously, we do not have records of what people thought while they were chatting with ELIZA. I am simply recalling the conversational situation: it was not a human talking with ELIZA but a human talking with DOCTOR. To a DOCTOR they granted intelligence, not to ELIZA.
What the Eliza effect misses is the dangerous eagerness of humans to offload their thought process onto large language models, which they assume can think because the models can produce flawless English (and many other languages), which of course here means white standard English with business-class proficiency. Though used to generate everything from wedding toasts to student essays, ChatGPT speaks with the rhetorical force of a machine-learning mansplainer. What the Eliza effect misses is that in Pygmalion, Shaw was warning us about Henry Higgins.
What could go wrong if we don’t call attention to the limits of these bloviating black-boxed algorithms? Lawyers could end up with fake citations in their briefs. Defendants could end up with even more inequitable bail rates. Politicians could be subject to machine-generated defamation. As a writing teacher, though, I worry more about “the average” passable idea becoming the end of my students’ thinking.
Whenever you ask ChatGPT to produce content outside of your expertise, and in place of your expressions of thought, you are likely to fall prey to the Henry Higgins effect.
The Higgins Effect, a choice
Why should you fight the Higgins effect? Though Higgins makes the wager, it is Eliza who makes the bargain: in exchange for taking the professor’s chauvinistic abuse, she will receive the linguistic means of improving her socio-economic status. She is like the college applicant who uses an LLM to generate her personal statement, or perhaps the graduate who uses it to compose their cover letter. What gets lost is not just her voice but also, for much of the story, her independence, her effacement continuing until she stands gracefully, subdued, by Higgins’ side at a high-society party, passing indeed for a duchess.
What is so bad about acquiring the means of social mobility? Perhaps it is worth remembering that the cost is her individuality, her identity, and her freedom. Remember, everyone is data to Henry Higgins. Everyone is a subject to, and material for, his scientific systems. (Remind you of anyone designing these systems?) He turns their utterances into something valuable to him, so that he can reproduce the language without attachment to its location or history or person. We are in danger of losing not just the voices of writers but also their thoughts, pre-empted by the imperious and authoritatively normal language of Higgins. Eliza’s gambit, her trade, leaves her alienated from her former registers of speech, with limited ability to code-switch, her cockney slang emerging only when she is pushed to the emotional brink, since it is a language she has learned to see as impoverished.
Questions: Setting aside the musical’s “happily ever after,” how did the transaction pay off? Did Eliza Doolittle become more herself when using the language Higgins provided? What’s at stake if we put too much credence in these LLMs?
Well, let’s return to Pygmalion in its musical form, My Fair Lady:
Higgins: What’ll become of her if we leave her in the gutter, Mrs. Pearce?
Mrs. Pearce: That’s her own business, not yours, Mr. Higgins.
Higgins: When I’m done, we’ll throw her back.
If we don’t want to end up with this fate, we might want to do more than throw some slippers at Henry Higgins. Sure, we can have our fun with it, but… If we offload our thinking to its stochastic textual generation, as just-in-time good-enough prose, we will end up reproducing perfectly average ideas (or perhaps randomized remixes of them), repeating accepted assumptions, re-iterating and reinforcing the status quo in a steady babble of passable paragraphs, washing unique voices and ideas like Eliza’s into the gutter. Not convinced? Just you wait.
The Eliza effect may have named our first sixty years of experience with AI, but the Higgins effect describes the bargain we now make: trading our voices for something deemed more suitable, in pursuit of the status of fluent and fluid language.
[To read more about ELIZA, come to our new website: Finding ELIZA, where our team of researchers is exploring the history and influence of this famous forerunner of modern-day conversational programs. We’ll publish our findings there and in our forthcoming book: Please, Go On.]