When ChatGPT landed on our doorsteps in November 2022, it largely slipped beneath the education radar. By the time Term 1, 2023 rolled around, however, it’s fair to say that the situation had changed. Most states in Australia banned the technology in Department schools, and those bans remained in place for several months.
Despite the bumpy start, policy is quickly catching up with the reality of these technologies. We’re now starting to see some guidelines emerging from various states and sectors for the appropriate use of AI in secondary education. In the tertiary sector, we’re also seeing some fantastic research beginning to emerge which accounts for Generative AI and which can be adapted to the secondary context.
AI across the states
While we still don’t have a national policy for AI in education (it’s on its way…), several states have released guidelines. The NSW Department of Education, SACE, and QCAA all have dedicated pages with AI advice, and I’m sure more will come soon.
Each of these policies has strengths and limitations, but they all offer a tentative first step towards clear approaches for secondary schools. All three institutions share a common view of generative AI as a tool with substantial potential for learning and assessment purposes, but caution against uncritical use. While the SACE Board and the NSW Department make specific references to ChatGPT, the QCAA addresses AI technologies more broadly, encompassing chatbots, deep learning, machine learning, and natural language processing.
The guidelines from all three entities stress the importance of creating original work and not presenting AI-generated content as the student’s own. They agree that AI usage should be cited, with specific details provided when the generated content is quoted or paraphrased. This emphasis underscores the shared commitment to academic integrity.
There are, however, differences in the approach to assessment practices. The SACE Board allows AI in school-based assessments and task design but prohibits its use in external exams. The NSW Department focuses more on the need for staff to verify the accuracy and suitability of AI content, and the QCAA recommends incorporating guidelines on AI usage into school assessment policies.
The importance of data privacy and accuracy is highlighted differently across the guidelines. The NSW Department provides a detailed approach to safeguarding personal information, including de-identification and anonymisation, while the QCAA suggests educating users about potential privacy and accuracy issues associated with AI use. The SACE Board does not explicitly address this issue.
Finally, all institutions advocate for continuous learning and adaptation when it comes to writing school policies and to using these technologies. The NSW Department emphasises mandatory staff training in cybersecurity and child protection, while both QCAA and SACE recommend schools use their existing Academic Integrity courses and resources to discuss ethical scholarship with students.
The AI Assessment Scale
Earlier in the year, I made a post based on a discussion with the staff of Edith Cowan University, where I suggested that AI in Assessment could be placed on a scale from “no AI” to “full AI”. The post has become one of my most successful, and has already been adapted for both K-12 and tertiary.

The AI Assessment Scale aligns well with all three states’ initial policies for using AI in secondary education. When considering the “No AI” level on my scale, each set of guidelines acknowledges that there are certain contexts, such as external examinations (as stipulated by the SACE Board), where the use of AI is inappropriate or even prohibited. All guidelines underscore the paramount importance of academic integrity and of students producing their own work, and caution against presenting AI-generated content as original student output.
The “Brainstorming and ideas” and “Outlining and notes” levels of my scale also share common ground with these guidelines, which suggest AI can be a beneficial tool for research and development of ideas. For instance, the SACE Board endorses AI as a resource for gathering information, and the QCAA advocates for its use in developing vital 21st-century skills.
When we look at the “Feedback and editing” level of my scale, it works well with the NSW Department’s guideline that stresses the verification of AI-generated content by staff for accuracy and suitability. This level mirrors the concept of AI being used as a tool to enhance the quality of students’ work by providing feedback and identifying areas for improvement, but not necessarily replacing the students’ work outright.
Finally, the “Full AI” level on my scale, where AI is used to generate the entire output, finds partial alignment with the SACE Board’s policy, which supports AI as a research tool and for use in school-based assessments. However, it is crucial to note that these guidelines also caution against presenting AI-generated content as a student’s original work, so the purpose of the task would need to be clearly articulated. The NSW Department’s strong emphasis on safeguarding personal information could also come into play at this level, given the substantial amount of personal data a full AI-generated output might require. This level highlights potential areas of conflict with these guidelines, demonstrating the importance of clear policy and ethical considerations when integrating AI into education.

Adapting Tertiary Approaches
Earlier this month, Deakin’s Centre for Research in Assessment and Digital Learning (CRADLE) released an excellent short paper on AI assessment at a tertiary level. I think that much of the content could be readily adapted to the secondary school context, and that it should play an important role in creating school assessment and academic integrity policies.
The report discusses the influence of generative artificial intelligence (GAI) on assessment practices. It states that readily available AI resources like ChatGPT can lead to better assessment methods and stresses the need to prepare students for a world influenced by AI. The report suggests principles of assessment design for a GAI era, and discusses how current assessment practices should adapt to GAI.
CRADLE’s recommendations include: ensuring assessments verify learning outcomes, designing feedback sequences, developing student capability to judge quality, providing multiple submission formats, focusing on evidencing outcomes, having open conversations about GAI, reviewing rubrics and assessment criteria, specifying when it’s appropriate to use GAI, designing tasks to highlight students’ unique achievements, and developing and assessing critical digital literacies.
CRADLE’s report overlaps with the secondary school state policies in lots of areas:
- Ethical use of AI and student education: All three state policies and the CRADLE report emphasise the ethical use of AI tools and stress the importance of student education about these tools. This involves understanding their potential limitations and the necessity of authentic work creation.
- Assessment integrity: Both the state policies and CRADLE highlight the need to maintain academic integrity in assessments, whether through acknowledging the use of AI tools in the assessment process or ensuring the work presented is a student’s own.
- Use of AI for learning enhancement, not replacement: The guidelines suggest the use of AI as a tool for research and learning enhancement, rather than as a means to replace student effort. This is echoed in the CRADLE report, which advocates for GAI to be incorporated into task designs and assessment methods that promote students’ unique achievements and critical thinking.
- Clear communication and open discussions about AI use: CRADLE’s report prioritises having open conversations with students about the use of GAI in sanctioned ways, how it may or may not breach academic integrity rules, and its limitations. This aligns with the state policies which advocate for clear guidelines and rules concerning the use of AI in schools.
- Privacy concerns: The Queensland and NSW policies stress educating users about potential privacy issues related to AI use, which aligns with CRADLE’s emphasis on the need for students to understand the ethical complexities associated with new technologies.
- Incorporating AI tools into task design and assessment methods: Both the SACE Board and the CRADLE report suggest integrating AI tools into task design and assessment methods. This includes devising multiple submission formats and designing tasks that allow students to demonstrate their unique achievements.
Secondary School Assessment Policies
I’m currently working with a number of schools on AI policy, including assessment, academic integrity, and ethics. These new state policies and the research from the tertiary sector are proving very useful in all of those discussions. But every school will need to adapt its own policies and processes to suit its students and community. A while back, I wrote a post about AI policy in secondary schools, and I took the approach of a series of questions or prompts that any school could use to develop its own policies.
I’m going to do the same here. In the interest of transparency, here’s my process: I’m taking this entire blog post so far, plus my AI assessment scale, and passing it through ChatGPT with the following prompt:
Using the blog post, state policies, tertiary information from CRADLE, and my AI assessment scale, create a series of questions which a secondary school could use to develop its own AI assessment policy. It should be adaptable enough to work in all faculties. In this AI assessment policy, generative AI should include not just ChatGPT but also image, audio, and video generation.
ChatGPT prompt
I have then edited the questions for accuracy and to make them relevant to secondary schools across Australia.
Now that ChatGPT allows sharing, you can also view the entire chat thread here: https://chat.openai.com/share/e32f716f-7e22-4dcd-a2e7-60190a42f2d3
Developing AI Assessment Policy: Questions for School Leadership
- What is the school’s overall stance on AI usage in secondary education? Is there a unified view on this issue among staff, students, and parents, or are there conflicting perspectives that need to be reconciled? How do you know?
- How does the school define “generative AI”? Does it include only text-generation tools like ChatGPT or is it extended to include image, audio, and video generation?
- How will the school approach the ethical use of AI, including the issue of academic integrity? How will the school ensure that students understand and abide by these ethical guidelines?
- What policies will the school implement to ensure student data privacy when using AI tools? How will these policies align with the guidelines proposed by the education department, which emphasise safeguarding personal information?
- How will the school educate students and staff about the potential privacy and accuracy issues associated with AI use, as suggested by the QCAA and CRADLE’s report?
- To what extent will AI tools be incorporated into assessment practices? At what levels of the AI Assessment Scale will AI usage be permitted and in which situations would it be inappropriate or prohibited?
- How will the school manage the use of AI in brainstorming, idea generation, outlining and notes, feedback and editing, and full AI generation? How will these different levels of AI integration be monitored and evaluated?
- How will AI-generated content be cited and acknowledged in student work to maintain academic integrity? What penalties will apply for failing to appropriately cite AI-generated content?
- How will the school ensure that AI is used as a tool to enhance student learning, rather than replace student effort, as recommended by the state policies and CRADLE’s report?
- How will the school communicate its AI policies and guidelines to students, parents, and staff? What mechanisms will be in place for revising and updating these policies as AI technologies and their usage evolve?
- How will the school integrate AI tools into task design and assessment methods? How will it ensure that these tasks allow students to demonstrate their unique achievements and critical thinking skills, as suggested by CRADLE’s report?
- How will the school support staff in acquiring the necessary skills and knowledge to effectively use AI tools in teaching and assessment practices?
- How will the school measure the impact of AI on teaching and learning outcomes? What mechanisms will be put in place for continuous evaluation and improvement of AI usage in the school?
- How will the school manage any potential challenges or conflicts that may arise from integrating AI into teaching and assessment practices?
- How will the school prepare students for a world increasingly influenced by AI, as suggested by CRADLE’s report, including the development and assessment of critical digital literacies?
I’ve also created a mock school policy which “answers” those questions and could be discussed alongside your existing school policies. It’s too generic to use out of the gate, but it might prompt a few ideas, particularly if used alongside the state documents and the AI Assessment Scale.
I hope this post is useful in conversations about AI assessment in secondary schools. We’re still a little way off from a consistent national approach (and given we can’t agree on a National Curriculum, and we already have one of those, I’m not holding my breath) but this should go some way to developing your own school’s approach.
I’m running professional development and advisory services in Semester Two for AI and assessment. If you’d like to discuss working with your school, please get in touch.