A few weeks ago, I wrote about the techniques I use when incorporating AI into my writing process. That article described the apps I use to get these blog posts published. Voice-to-text is my favourite AI technology and the way I write most of my articles while I’m on the move, but as I was writing the piece, I realised that technical discussions like that one often leave out large parts of the process.
In this article, I’m broadening out from the technical discussion to look at the writing cycle and to untangle where large language model-based technologies really cause issues in the process.
When I teach writing, I refer to a cycle from my book, Practical Writing Strategies, co-authored with Ben White. In the book, we outlined a six-part writing cycle and argued that both the process and the product are important. When teaching writing, particularly in academic contexts, we often seem to value and assess only the product, leaving the process to fend for itself.

When we wrote the book, technologies like ChatGPT were only just reaching mainstream attention. Here’s what we wrote back in 2022-23:
Increasingly, students can also collaborate with non-human ‘co-authors’, including artificial intelligence (AI) writing apps and ‘spinners’ – apps which paraphrase any text that is dropped into them. These technologies are already out there, and your students are likely using them. Rather than policing the technologies, we suggest finding ways to work with them ethically and appropriately…
Two and a half years on from the release of ChatGPT, anybody who teaches writing could probably tell you how that prediction has panned out.
But I want to put aside the moral panic of the death of writing and re-centre the discussion on process, and valuing how learners develop writing skills, ideas and approaches to different genres and styles of writing. We have to accept that artificial intelligence is part of that process.
Much of our written language has been technologically mediated for a long time. Without getting into philosophical debates about whether a pen counts as a technology, I’m referring specifically to digital technologies, from word processors and their impact on formatting and style, to the internet and the impact on research, community, sharing of ideas and self-publication.
Generative artificial intelligence feels new because these earlier technologies couldn’t do the writing for us, no matter how many times we hit Shift+F7 and fumbled with Microsoft Word’s thesaurus.
In this article, I’m also going to neatly sidestep the many valid ethical concerns with the technology. I’m not going to talk about copyright, bias, environmental impacts, or the centralisation of power represented by these technologies. Of course, those discussions are worthwhile, and I’ve written about them elsewhere, but at the end of the day, people are using this technology to write. Students will need to understand how to use these technologies well, and educators have a responsibility to have these discussions out in the open. Students can’t be expected to learn everything from TikTok.
The Writing Process
In Practical Writing Strategies, we split the writing process into six stages: purpose, exploration, ideas, skills, collaboration and publication. This is fairly self-explanatory if you teach writing, and rather than going through every stage again in detail, I’ll point you towards other articles, and of course, the book itself (great book, rush out and buy it right now).
In 2024, I wrote a series of articles which aligned the writing cycle to ideas for using generative AI platforms like ChatGPT, Gemini, and various image generators. These have also formed the basis for a course I’m working on which will be released later this year alongside my other AI courses.
- Teaching AI Writing: Purpose
- Teaching AI Writing: Exploration
- Teaching AI Writing: Ideas
- Teaching AI Writing: Skills
- Teaching AI Writing: Collaboration
- Teaching AI Writing: Publication

What I want to do in this article is highlight a few of the places where artificial intelligence intersects with the first stage of the writing cycle, and where it causes problems.
Flattening the Cycle
Our writing cycle didn’t come from nowhere, and of course there are many similar approaches to teaching the process of writing. One discussion of the writing process that I like is from Gunther Kress and Theo van Leeuwen’s Multimodal Discourse, where they talk about the strata of production. They highlight that successive technologies, from the printing press to the internet, have flattened the strata of production, and I have written elsewhere about how this might be applied to GenAI.
One issue presented by generative AI is that it can squash the entire process of writing, flattening not just the strata of design, production and distribution, but short-cutting the process entirely from purpose (“I want to write a blog post…”) to publication (“…and here it is”).
In an academic context, this is problematic because the strata of production in education stand in as a proxy for learning. Flatten the process, flatten the learning.
And of course, there are situations where flattening this process is not only expedient but actually beneficial. Consider the market forces that push people to produce content on a huge scale: clickbait farms, automated news sites and blogs which regurgitate other media to generate ad revenue, and chumboxes (great word, though). From a purely economic standpoint, generative artificial intelligence’s flattening of the process is fantastic, theoretically shortening the distance from click to dollar.

There are other times when it might be expedient and effective to flatten the process of writing, such as when your boss sends you an email that you don’t know how to respond to politely, or when you have to produce a piece of writing that is purely functional, conveying information as quickly as possible.
In short, generative AI’s flattening of the process of writing is beneficial when the purpose of the writing is arbitrary, undesired or a vehicle to achieving some other ends, such as click farm revenue.
And in education, we’ve created a lot of these scenarios.
Much of the writing we require students to do doesn’t have to be writing. It could be oral, it could be a discussion, it could be an observation or demonstration of something practical rather than a write-up of a step-by-step process. Because writing is expedient and stands as a more-or-less-decent proxy for learning, it has become the default mode of many assessments. But that abstracts the purpose of the writing: the purpose is not the writing itself, but to achieve the thing that would often be better done through a different medium. Students aren’t stupid, and they can tell when a writing task is arbitrary.
The second problem represented by the flattening of the process is that in many cases, the process doesn’t exist in the first place. We tell students that there is a process, but then we fail to teach them what that process actually is. We don’t offer support, guidance or evaluation, and we don’t really value the process at all. This happens often in my subject, English, where we talk about the importance of drafting and editing, but both the teachers and students get the sense that we’re in a race to the finish. When it comes to the final assessment moment, we’re only really interested in the product. Again, process is arbitrary.

Getting it Right
Of course there is no silver bullet for writing instruction, especially with GenAI in the mix. But there are ways to make writing in education more purposeful, or to rethink when and where we use writing as an assessment item.
First of all, we need to acknowledge that the purpose of writing needs to be uncoupled from assessment. We can’t keep treating writing as an expedient proxy for learning; if nothing else, it is no longer expedient. Educators’ workloads have increased since 2022 as they wade through AI-generated essays, and setting more essays as assessment won’t solve that.
Secondly, we need to hold the process in higher regard. In future posts, I’ll write more about the rest of the writing cycle. Until then, I strongly recommend following Jason Gulya on LinkedIn, a writing educator grappling with these issues. He’s doing it all in the open, sharing his resources, successes, and challenges very publicly.
Finally, even in language arts, literature, humanities and English subjects, we need to identify opportunities to move beyond a final piece of writing as the end-point of a project. Out in the wild, a finished piece of writing – a book, a blog article, a social media post – is rarely the end-point.
Books are discussed, shared, reviewed, and critiqued. Blogs and social media posts are designed to be commented on and engaged with. Maybe part of our process also needs to extend beyond the publication stage.
I’d love to hear your ideas about how we can re-centre the importance of process in writing. Get in touch below.
Cover image: Leo Lau & Digit / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/