Case Study: Saskatoon Public Schools and AI Assessment


One of the most satisfying things about releasing a framework like the AI Assessment Scale under an open licence is watching how educators and education systems take it and genuinely make it their own. The AIAS is now in use across more than twenty countries, and every implementation I hear about teaches me something new, whether that is the language a system chooses, the visual design decisions it makes, or the policy conversations that develop around it.

This case study, from Saskatoon Public Schools in Saskatchewan, Canada, is one of the richer examples I have come across, and a great illustration of what thoughtful, division-wide adaptation can look like.

Adapting the AIAS

In early 2025 Candace Elliott-Jensen, Coordinator of Secondary Learning Supports at Saskatoon Public Schools, got in touch after encountering the AIAS through a colleague. One of her teachers, Jo Boots, had connected with me on LinkedIn the previous year to share feedback on the framework; Jo then shared the work with Candace, who was beginning to lead a team developing division-wide AI guidelines for teachers, parents, and students.

Candace and I briefly discussed the AIAS and the possibility of adapting the scale for their use. The AIAS is released under a permissive open licence, and all of the academic publications are available in Open Access journals.

Candace got in touch again in January 2026 with an update. Over the course of the year, her team had adapted the AIAS into language and visuals that fit the Saskatoon secondary context, and in autumn 2025 they rolled it out as a visual displayed in every high school classroom across the division. Teachers had begun using it as a shared reference point for conversations with students about genAI, academic integrity, and demonstrations of learning; families and students had responded positively; and the team was now looking ahead to an elementary version.

The most interesting decision they made was to move away from the AIAS’s original hierarchical structure. Rather than presenting the categories as a scale, the Saskatoon team treat them as parallel options, each suited to different learning purposes, and their visual design deliberately mixes the icons rather than numbering them or arranging them as steps on a ladder. They also introduced the practice of placing the icons directly on assignments, so that students can see at a glance which kinds of AI use are appropriate for a given task.

Saskatoon Public Schools' adaptation of the AIAS, 2025: an infographic on the use of generative AI (GenAI) in the division's high schools, detailing approaches such as oversight, editing, partnership, and independent learning.

I asked Candace if she would be willing to answer a few questions about the project, and she generously agreed. What follows is her own account of the work, lightly edited.

Q&A with Candace Elliott-Jensen, Saskatoon Public Schools

1. What was the conversation around AI in your high schools like before you adapted the AIAS framework? What were teachers and students struggling with most?

Before we adapted the AIAS framework, conversations about AI in Saskatoon Public Schools secondary classrooms tended to fall into a fairly binary space: is AI allowed or not? That framing created understandable uncertainty for teachers and mixed messages for students. Teachers were thinking deeply about issues such as academic integrity, the reliability of AI-generated information, privacy, and how to respond when AI might be used inappropriately. At the same time, many recognized that trying to ban or block these tools was neither realistic nor particularly educational.

Students were navigating similar uncertainty. Many were already experimenting with AI, but were often unsure where the line was between appropriate support and academic dishonesty. The result was that both teachers and students were operating without a shared language for talking about these tools.

At a system level, Saskatoon Public Schools had already been clear that we do not advocate banning or blocking generative AI tools. Our emphasis has been on responsible, ethical, and informed use aligned with our existing policies and digital citizenship expectations. What we were missing, however, was a practical instructional framework that could help teachers communicate expectations at the classroom level and design learning tasks that acknowledged the presence of AI with shared language. The challenge was less about willingness to engage with AI, and more about helping teachers feel confident in how to structure learning, clarify boundaries, and assess student thinking in an environment where AI is increasingly present.

2. Can you walk me through how your team adapted the scale for the Saskatoon context: what did you keep, what did you change, and why?

We were initially drawn to the AIAS because it provided a clear way to categorize different types of AI use and shifted the conversation away from a simple “allowed versus not allowed” approach toward thinking about instructional purpose. That structure helped organize the conversation and gave teachers language to talk about how AI might support learning.

We realized that presenting the model as a linear scale was not the best fit for how we wanted these conversations to unfold in our classrooms. Rather than suggesting that AI use increases or decreases in value along a continuum, we emphasized that the categories are distinct and may be appropriate for different learning situations. In practice, this means that not using AI or using AI for idea generation, editing, and collaboration are not inherently better or worse than one another. Each serves a different purpose depending on the learning task and the instructional goal. Our visual design reflects this thinking: the icons are intentionally neither numbered nor arranged as steps on a ladder, but instead appear in a mixed layout to signal that they represent parallel options rather than a progression.

One of the most practical adaptations we introduced was the use of visual icons that teachers can place directly on assignments or learning tasks. These icons help clarify which types of AI use are appropriate for a particular activity and provide students with clear expectations from the outset. The addition of the icons helps teachers communicate expectations quickly and clearly, while also prompting richer instructional questions such as: What role, if any, should AI play in this task? How does that use support the learning goal? What does transparency look like in this context?

While our design moves away from a hierarchical scale, the strength of the original AIAS framework remains central to our work. Its ability to clearly name and categorize different ways AI can support learning provided the foundation for our adaptation and helped structure more thoughtful conversations with teachers about purpose, transparency, and student agency.

3. You mentioned the framework shifted discussions from “Can students use AI?” to richer conversations about purpose and transparency. Can you give a specific example of what that looks like in practice: a particular class, subject, or moment that stood out?

One way this shift has shown up in practice is through modeled lessons that we’ve shared with teachers. For example, in an English Language Arts context where students are writing personal narratives, we’ve modeled framing the task using the “GenAI for ideas and structure” purpose. In this approach, students are invited to use AI to help generate possible themes or organize their thinking into a rough outline, while the narrative itself is written independently. As part of the process, students are also expected to briefly disclose how AI supported their planning.

Teachers have reported that when learning is framed this way, the conversation during conferencing and reflection changes significantly. Rather than focusing on whether students used AI, questions shift toward purpose and process, such as:

  • What role did AI play in your process?
  • What did the tool help you think through or organize?
  • Where do we see your own thinking and voice in the final piece?

This shift moves the focus away from trying to detect AI use and toward supporting students in thinking more intentionally about their learning process. Students are able to articulate how AI supported idea generation or organization, while the final piece reflects their own experiences, voice, and decisions as writers. In that sense, the framework helps normalize transparent use while still keeping the emphasis on student thinking and authorship.

4. How have students themselves responded? Has having a shared language around AI use changed the way they talk about their own learning or academic integrity?

Having a clearer framework that names legitimate purposes for AI use has begun to change how students talk about their own process. When expectations are explicit, students appear more comfortable being transparent about how they used a tool and where their own thinking fits within the work.

We are starting to hear students describe their process in ways that reflect that shared language. For example, students will say things like:

  • I used AI to explore some ideas, but I wrote the final response myself.
  • I used AI to help edit my writing, and here are the changes I decided to keep.

Those kinds of conversations move the focus away from simply asking whether AI was used and toward understanding how it was used and why. In that sense, academic integrity discussions begin to shift from a compliance mindset toward one that emphasizes skill development, ethical decision-making, and ownership of learning. Students are starting to see integrity less as “avoiding AI” and more as being honest, thoughtful, and responsible about how tools support their work.

At the same time, we are still very much in the early stages of this work. The framework has helped open the door to more productive conversations, but we know that achieving consistency across classrooms will take time. Teachers are continuing to experiment with how to apply the framework in different subjects and learning tasks, and part of our ongoing work will be supporting that shared understanding so that students experience clearer and more consistent expectations across courses.

5. As you begin adapting the framework for elementary, what’s your early thinking on what needs to change for younger learners, and what do you think stays the same?

As we begin thinking about an elementary version of the framework, our intention is to keep the overall structure and icons consistent with what we are using in secondary schools. Maintaining that continuity helps build a shared language across the division so that, over time, students encounter the same ideas about AI use as they move through the grades.

What will likely change is how those ideas are expressed. With younger learners, the language will need to be simpler and more concrete, and the examples will need to connect directly to classroom activities that are familiar at the elementary level. The visual design will also play a larger role, with more student-friendly imagery and clearer cues that help younger students quickly understand the intent.

Another important difference is how the framework will be used in practice. In secondary classrooms, students can often independently interpret the icons and expectations connected to an assignment. In elementary classrooms, the framework will likely be introduced more through teacher-guided conversations and modeling, where the teacher helps students understand how and why a tool might be used during learning.

What remains consistent is the core message behind the framework. AI is positioned as a tool that can support learning, not replace thinking. Transparency about when and how tools are used remains important, and the goal is to begin building early habits around responsible and thoughtful technology use. Over time, that shared foundation can help students develop a clearer understanding of how AI fits within learning as they move through the K–12 system.

A few reflections

A few things stand out to me in Candace’s account. The first is the shift from a compliance framing toward one built around purpose, transparency, and student authorship. The second is the decision to remove the hierarchy and treat the categories as parallel options; Saskatoon’s reasoning here is sound, and it is something we also discussed in our “version 2” assessment scale. The third is the consistency plan across K–12: keeping the icons stable across the grades, while adjusting language and pedagogy, is an ambitious commitment to a shared language over many years of a student’s schooling.

I am very grateful to Candace and her team for taking the time to share this work, and I am looking forward to seeing the elementary version once it is launched. If you are adapting the AIAS in your own system and have a story to share, I would love to hear it; please get in touch below.

Want to learn more about GenAI professional development and advisory services, or just have questions or comments? Get in touch:
