What Can Lewis Carroll Teach Us About GenAI?

Lately I’ve been posting a bit about metaphors, and their use in teaching and critiquing Generative AI products like ChatGPT. As an English teacher first and foremost, I want to find ways to teach with, through, and sometimes against AI that keep students’ voices and educators’ expertise front and centre.

And I’m always looking for new ways to help demystify GenAI. Recently, I stumbled across a way to describe AI in an unusual place: Lewis Carroll’s Through the Looking Glass.

Here’s the full passage:

“You are sad,” the Knight said in an anxious tone: “let me sing you a song to comfort you.”
“Is it very long?” Alice asked, for she had heard a good deal of poetry that day.
“It’s long,” said the Knight, “but it’s very, very beautiful. Everybody that hears me sing it——either it brings the tears into their eyes, or else——”
“Or else what?” said Alice, for the Knight had made a sudden pause.
“Or else it doesn’t, you know. The name of the song is called ‘Haddocks’ Eyes.’”
“Oh, that’s the name of the song, is it?” Alice said, trying to feel interested.
“No, you don’t understand,” the Knight said, looking a little vexed. “That’s what the name is called. The name really is ‘The Aged Aged Man.’”
“Then I ought to have said ‘That’s what the song is called’?” Alice corrected herself.
“No, you oughtn’t: that’s quite another thing! The song is called ‘Ways And Means’: but that’s only what it’s called, you know!”
“Well, what is the song, then? ” said Alice, who was by this time completely bewildered.
“I was coming to that,” the Knight said. “The song really is ‘A-sitting On A Gate’: and the tune’s my own invention.”

Lewis Carroll, Through the Looking Glass, Chapter 8

Let’s break it down.

The White Knight offers a song to comfort Alice – not that she’s particularly interested. Like many of our students, she’s already had enough of poetry. But the Knight, like our friendly Large Language Model-based chatbot, is indomitably optimistic. You can almost hear it: “Sure, let me help you with that!”

When the Knight tells Alice what the name of the song is called, however, things start to go a bit sideways.

From nonsense to doubletalk to linguistic somersaults that make his stories notoriously hard to translate, Lewis Carroll is well known for his language trickery. This passage is a fine example.

That’s because the Knight doesn’t immediately tell Alice what the song is, or even what it’s named: there are layers of abstraction that Alice has to unpick before she can make any sense of the Knight’s message.

And so it is with Generative Artificial Intelligence.

Understanding Chatbots

Like the Knight, ChatGPT speaks in abstractions. As I wrote in an earlier post, Chatbots don’t make sense; they make words. But even that’s not strictly true. They generate tokens, and convert those tokens back into readable Unicode. They take maths – statistics, probabilities – and derive seemingly meaningful text from the user’s input.
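To make that concrete, here’s a deliberately tiny sketch of the tokens-to-Unicode round trip. The three-word vocabulary is invented for illustration – real models use learned sub-word vocabularies of tens of thousands of tokens – but the shape is the same: the model only ever handles numbers.

```python
# Toy illustration: a chatbot never works with words, only integer token IDs.
# This vocabulary is invented for demonstration; real tokenizers learn
# sub-word vocabularies far larger and more faithful than this.
vocab = {"the": 0, "aged": 1, "man": 2}
id_to_word = {i: w for w, i in vocab.items()}

def encode(text: str) -> list[int]:
    """Readable Unicode -> token IDs (all the model ever computes with)."""
    return [vocab[word] for word in text.lower().split()]

def decode(ids: list[int]) -> str:
    """Token IDs -> readable Unicode, for the human at the other end."""
    return " ".join(id_to_word[i] for i in ids)

ids = encode("The Aged Aged Man")
print(ids)          # [0, 1, 1, 2] -- the model's-eye view: just numbers
print(decode(ids))  # "the aged aged man" -- converted back for us to read
```

Note that even this toy round trip is lossy (the capitalisation is gone): what comes back is a reconstruction, not the original.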

Does it matter? Or, like the Knight, am I just speaking in riddles to deliberately piss people off?

I think it does matter. And I’m not alone. I recently came across an article shared by Kim Flintoff on LinkedIn that puts the problem another way:

[Chatbots] are designed to produce statements that are plausible. As Lilly Ryan puts it, their statements are not facts, they are Fact Shaped. And I would go further. Their summaries are not summaries. They are summary shaped. Their solutions are not solutions. They are solution shaped. And the systems are not people. But they are increasingly people shaped. 

Dr Linda McIver – From Hypnotised to Heretic: Immunising Society Against Misinformation

“Their statements are not facts, they are Fact Shaped.” Or, to put it back into the words of our Alice analogy, the name of the song is called… something shaped like the song itself.

The output of a chatbot is not reality. It is not even a true reflection of reality. It is Reality Shaped, in the same way that the name of the song is different to the song itself (and what the name of the song is called, even further removed).

Take this argument from scholars Bill Cope and Mary Kalantzis:

“Computers can’t mean anything other than zero or one. All they can do is calculate by textual transposition: recorded Unicode > chunked into tokens > binary notation > calculation of the probability of the next token > token > readable Unicode”
Cope, B., and Kalantzis, M., (2024) Literacy in the Time of Artificial Intelligence
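The pipeline Cope and Kalantzis describe can be sketched in miniature. Everything below is invented for illustration – a hand-written four-token vocabulary and made-up probabilities standing in for a neural network with billions of parameters – but the shape matches theirs: Unicode in, tokens, a probability over the next token, a chosen token, Unicode out.

```python
# A miniature of the Cope & Kalantzis pipeline:
# Unicode -> tokens -> probability of next token -> token -> Unicode.
# Vocabulary and probabilities are invented for illustration only.
vocab = {"a-sitting": 0, "on": 1, "a": 2, "gate": 3}
id_to_word = {i: w for w, i in vocab.items()}

# Made-up conditional probabilities: given the last token, how likely is
# each candidate next token? A real model computes these with a trained
# neural network; the calculate-then-pick principle is the same.
next_token_probs = {
    0: {1: 0.9, 2: 0.05, 3: 0.05},  # after "a-sitting", "on" is most likely
    1: {2: 0.8, 3: 0.2},            # after "on", "a" is most likely
    2: {3: 1.0},                    # after "a", "gate" is (here) certain
}

def generate(prompt: str, steps: int) -> str:
    ids = [vocab[w] for w in prompt.lower().split()]  # Unicode -> tokens
    for _ in range(steps):
        probs = next_token_probs.get(ids[-1], {})
        if not probs:
            break
        ids.append(max(probs, key=probs.get))         # most probable next token
    return " ".join(id_to_word[i] for i in ids)       # tokens -> Unicode

print(generate("a-sitting", 3))  # "a-sitting on a gate"
```

Nowhere in that loop is there a step marked “meaning”: the output is whatever sequence of tokens the probabilities favour.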

And yet, the illusion of meaning is compelling. Much like Alice attempting to unravel the Knight’s riddles, we find ourselves engaged in the process of decoding what generative AI produces, often assigning significance where there may be none. Whether or not chatbots actually produce meaning (or sense) might turn out to be irrelevant.

This is where metaphor becomes not only illustrative but essential. By comparing the output of AI to the Knight’s absurd song-naming exercise, maybe we can help students and educators grasp the slippery nature of AI’s outputs. They are plausible, shaped, and inviting—but they are not necessarily true, and they do not possess intent or understanding.

So, what does this mean for teaching with or against generative AI? It means leaning into critical literacy more than ever. Students must learn to interrogate, contextualise, and, above all, question what they read—whether it’s a chatbot’s polished prose or a scholarly article (especially if that article has been written in part by AI). Educators, in turn, must equip them with the tools to discern the difference between something meaningful and something merely “meaning-shaped.”

I’m still on the fence when it comes to decisions about resisting or using AI in education. I’m not going to stop using AI, but I’m also not going to stop critiquing the systems that surround it. And I’ll continue to look for ways that we can use literature, metaphor, and the expertise of our field to both make the most of and push back against the technology.

Want to learn more about GenAI professional development and advisory services, or just have questions or comments? Get in touch:
