What the National AI Plan Means for Schools

This week, the Australian Government released the National AI Plan. It’s a 37-page document that sets out how the federal government intends to position Australia in the global AI landscape. If you’re a school leader, a classroom teacher, or involved in education policy in any way, you should read it in preparation for your 2026 planning.

I’ve spent the past couple of hours poring over the document, and I’ll be honest: it’s both encouraging and concerning. Encouraging, because education actually features as a priority. Concerning, because much of what applies to schools is either implicit rather than explicit, or tucked away in references to other frameworks and policies which to date haven’t really been fleshed out (I’m looking at you, 2023 Australian Framework for Generative AI in Schools…).

This article explores what the Plan means for Australian schools, what’s written directly about education, what the implications are for classroom practice, and what school leaders need to be thinking about as we head into 2026 planning.

What is the National AI Plan?

The National AI Plan 2025 was released this week (December 2025) by the Department of Industry, Science and Resources on behalf of the Albanese Government. It’s a whole-government framework that brings together existing AI initiatives and sets out new actions across three main goals: capturing the opportunities, spreading the benefits, and keeping Australians safe.

This isn’t Australia’s first attempt at AI policy. The Plan builds on the 2021 AI Action Plan and incorporates various commitments the government has made since 2022, including the Australian Framework for Generative AI in Schools, the AI Plan for the Australian Public Service (released November 2025), and over $460 million in existing AI funding. What makes this Plan different is its scope: it covers everything from data centre infrastructure to international partnerships, from worker rights to copyright law.

The Plan was written under the direction of Senator Tim Ayres (Minister for Industry and Innovation, Minister for Science) and Dr Andrew Charlton MP (Assistant Minister for Science, Technology and the Digital Economy). The Ministers’ foreword makes the government’s position clear: AI should “work for people, not the other way around,” and success will be measured by “how widely the benefits of AI are shared, how inequalities are reduced, how workers and workforces can be supported, and how workplace rights are protected.”

That framing – fairness, inclusion, opportunity – thankfully runs throughout the document. For schools, this is particularly relevant. Too often, we hear that education is being “left behind” and that the focus for AI should be an arms race to create bigger, better models and to power ever more “personalised” systems. AI is often positioned (by the companies that develop it) as a panacea for every problem, including “out of date education practices.”

I’ve written in the past about why AI needs to catch up with pedagogy, and not the other way around: teaching and learning isn’t the Victorian-era factory model that tech companies would have us believe it is. It doesn’t need “revolutionising” by AI, and teachers aren’t a homogenous collection of terrified Luddites cowering behind lecterns. Out in the real world, a lot of great work is already happening in schools to create fairer and more inclusive opportunities with digital tech.

Image Source: Our National Plan on a Page https://www.industry.gov.au/sites/default/files/2025-12/national-ai-plan.pdf

Part One: Capture the Opportunities

The Plan’s first major goal is about building infrastructure, backing local capability, and attracting investment. While none of this is explicitly about schools, the implications are significant.

Infrastructure and Access

The Plan includes substantial investment in digital infrastructure: expanding the NBN, upgrading connectivity to regional and remote areas, supporting data centre development (Australia attracted $10 billion in data centre investment in 2024 alone), and working on submarine cable connections. The regional-metro divide in AI adoption – currently sitting at 29% regional versus 40% metropolitan – isn’t going to close without infrastructure investment. Speaking as a regional Victorian myself, with no access to NBN (or cellular – we live in a mobile blackspot too), there is no way that we can get close to a fair system without serious investment.

The Plan also mentions GovAI, a “secure, whole-of-government AI hosting service” for the Australian Public Service. Schools are government entities in most cases, which suggests we may eventually have access to this platform. If government schools can access secure, Australian-hosted AI services through GovAI, it could provide an alternative to commercial platforms and give schools more control over student data.

Building Local Capability

The Plan commits to backing Australian AI capability through research funding, the National AI Centre (NAIC), and a new “AI Accelerator” funding round through the Cooperative Research Centres program. There’s over $362 million in targeted research grants and $47 million for the Next Generation Graduates Program, which is explicitly about building a “pipeline of highly skilled professionals in AI and emerging technologies.”

That pipeline starts in schools. If we want Australian AI companies developing solutions for healthcare, agriculture, and advanced manufacturing, we need students coming out of secondary school with both foundational AI skills and, for some, specialist capabilities. The Plan notes that “demand for AI-skilled workers has tripled since 2015” and that “broad AI skills and credentials are essential across the workforce.” Schools can’t outsource this responsibility.

Case Studies Worth Studying

The Plan includes numerous case studies, including one on AI use in Kakadu National Park, where Bininj Traditional Owners are working with CSIRO, Parks Australia, and partners to use AI and drones for invasive species management. Machine learning analyses drone imagery to guide land management decisions, with cultural protocols central to the approach: Traditional Owners govern data use, rangers operate drones to avoid sacred sites, and information is shared under Indigenous data sovereignty principles.
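For teachers who want to make this tangible in a Digital Technologies or Science class, here’s a minimal sketch of the kind of workflow at the heart of this case study: a fine-tuned image classifier run over drone photos. Everything specific here is invented for illustration – the model file, the two class labels, the filenames – and the real CSIRO system is far more sophisticated, but the core idea fits in a few lines of Python:

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Hypothetical: a ResNet fine-tuned to separate an invasive weed from native cover.
CLASSES = ["native_vegetation", "invasive_para_grass"]

model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.load_state_dict(torch.load("kakadu_classifier.pt"))  # invented filename
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

tile = preprocess(Image.open("drone_tile_0042.jpg")).unsqueeze(0)
with torch.no_grad():
    label = CLASSES[model(tile).argmax(dim=1).item()]
print(label)  # flags tiles for rangers to review; the decision stays with people
```

The point for the classroom isn’t the code itself – it’s that the model only suggests, and Traditional Owners and rangers decide what happens next.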

This is exactly the kind of thing schools should be teaching about – not AI as abstract technology, but AI as a tool embedded in cultural, environmental, and governance contexts. It’s also a reminder that the Plan takes Indigenous data sovereignty seriously. More on that below.

Image Source: NextDC Data Centre, Wikimedia Commons

Part Two: Spread the Benefits

The Plan is explicit that “every Australian, regardless of age, location or gender” should benefit from AI, and it names specific groups requiring particular attention: First Nations people, women, people with disability, and regional communities.

The National AI Centre and Industry Support

The Plan positions the National AI Centre as the government’s lead body for supporting AI adoption. The NAIC has already released Guidance for AI Adoption (October 2025), which includes practical resources and editable AI policy templates. These resources have been “simplified in partnership with business.gov.au” to make them accessible to small organisations.

Schools should absolutely be using these resources. The NAIC isn’t just for businesses – the guidance applies to any organisation adopting AI, including schools. In fact, the Plan explicitly states that the NAIC will help “SMEs, not-for-profits, social enterprises and First Nations businesses adopt AI responsibly.” Schools often operate like not-for-profits in terms of resources and constraints, and the NAIC materials are designed to be adaptable – they’re well worth a look.

https://www.industry.gov.au/national-artificial-intelligence-centre

First Nations Digital Inclusion and Data Sovereignty

The Plan makes several specific commitments regarding First Nations peoples. There’s a First Nations Digital Support Hub and Network of Digital Mentors to enhance digital skills and connectivity. More significantly, the government commits to upholding the Framework for Governance of Indigenous Data (NIAA 2024), which means “First Nations communities have control over the collection, access, use, and sharing of their data.”

If you’re a school using AI systems that involve First Nations students, their data, or Indigenous knowledge, you need to follow these principles. The Plan notes that “AI has manifested harms to First Nations people, including through perpetuating harmful stereotypes and the use, misattribution and falsification of First Nations cultural and intellectual property.” Schools must consult with Aboriginal and Torres Strait Islander communities about AI use, implement cultural protocols, and ensure data governance respects Indigenous sovereignty.

Skills, Training, and Professional Development

The Plan states clearly: “Industry, government, and the skills and education sectors all have a vital role in equipping students and the workforce to seize the opportunities AI presents.”

Several initiatives are directly relevant to K-12:

The National Skills Agreement identifies “ensuring Australia’s digital and technological capability” as a national priority. This covers vocational education and training, but it flows into schools through VET in Schools programs.

The FSO Skills Accelerator – AI (run by the Future Skills Organisation) is bringing together the VET sector and industry to “upskill teachers and trainers, provide training to learners, and collaborate with training providers.” The long-term goal is “a sustainable approach to AI skills development across the national skills and training system.” I spoke about the Accelerator on the podcast earlier this week, having attended the launch in August 2025.

The Digital Knowledge Exchange is “a national collaboration platform” for sharing best practice in digital skills and training across state and territory governments. If your education department isn’t plugged into this, they should be.

TAFEs and the Institute of Applied Technology are offering AI microcredentials, including a “Responsible AI microcredential” that has attracted over 150,000 enrolments to date. Secondary students could potentially access these credentials too – and at the very least, we should be thinking about how VET pathways prepare students for these qualifications.

The Plan also makes this explicit commitment: “The Department of Education and the Department of Employment and Workplace Relations will also explore ways to equip learners with the skills and credentials to participate in an AI-driven workforce and ensure a strong pipeline of AI-ready school leavers and graduates.”

Public Services and GovAI

The Plan states that “in education, AI offers opportunities to reduce teacher workloads and improve student outcomes.” It notes that “states and territories are funded to trial GenAI, reducing workloads for teachers and exploring safe classroom use.”

Most importantly, it references the Australian Framework for Generative AI in Schools (Department of Education 2023), which “supports the responsible and ethical use of GenAI tools” and provides “nationally consistent guidance to students, teachers, staff, parents and carers on the opportunities and challenges presented by AI.”

If you haven’t read the Framework, you should (and this one’s only 7 pages long). It’s the key document for K-12 AI implementation in Australia, and although it has faced criticism for being under-resourced, it at least established an early commitment from the federal government to letting schools safely explore AI use.

Part Three: Keep Australians Safe

The final section of the Plan focuses on managing risks, promoting responsible practices, and shaping international norms.

The AI Safety Institute

The government has established an AI Safety Institute (AISI) to “monitor, test and share information on emerging AI capabilities, risks and harms.” The AISI will provide “independent advice to ensure AI companies are compliant with Australian law and uphold legal standards around fairness and transparency.”

For schools, this means we’ll have an authoritative source for risk assessments of AI tools. The AISI isn’t replacing regulators – state and territory education departments retain responsibility for their sectors – but it will provide technical expertise and coordinate responses to emerging harms. If a new AI tool is causing problems, the AISI will be the body that identifies the risk and advises on mitigation.

The Plan identifies several harms that are absolutely relevant to schools:

Online Safety: The Online Safety Act 2021 addresses “AI-related risks through enforceable industry codes” and criminalises “non-consensual deepfake material.” The government is also considering “further restrictions on ‘nudify’ apps” and “reforms to tackle algorithmic bias.” These are urgent student safety issues. Deepfakes targeting students, AI-generated bullying content, and apps designed to create non-consensual images are all genuine risks in schools right now. The eSafety Commissioner continues to be a powerful force in the push to regulate and remove these apps.

Copyright: The government has “ruled out a text and data mining exception in Australian copyright law” and is working with the “Copyright and AI Reference Group” to consider updates. For schools, this affects how we teach about AI and copyright, and may have implications for the future of the technology in Australia.

Privacy: The government is modernising the Privacy Act 1988 to achieve “the right balance between protecting people’s personal information and allowing it to be used and shared in ways that benefit individuals, society, and the economy.” Student privacy, parental consent frameworks, and how schools handle student data in AI systems will all be affected by these changes.

Being Clear about AI-Generated Content (NAIC 2025) provides guidance on transparency measures including labelling, watermarking, and metadata recording. We’ve of course spoken a lot about “labelling” with our AI Assessment Scale, and content credentialing and watermarking are making strides, with companies like Google adding robust watermarking to their image platforms and other GenAI applications.
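To make “labelling” concrete: at its simplest, a label can be nothing more than a piece of metadata attached to a file. Here’s a toy Python sketch using the Pillow library that stamps a PNG with an “AI-generated” text chunk. The field names are invented for illustration, and this is nothing like the robust, tamper-resistant watermarking approaches mentioned above – it just shows the principle:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def label_as_ai_generated(src: str, dst: str, generator: str) -> None:
    """Attach simple provenance metadata to a PNG.

    A toy illustration of labelling only: ordinary metadata can be
    stripped or edited, unlike robust watermarking schemes.
    """
    image = Image.open(src)
    info = PngInfo()
    info.add_text("ai_generated", "true")   # field names invented for illustration
    info.add_text("generator", generator)
    image.save(dst, pnginfo=info)

label_as_ai_generated("artwork.png", "artwork_labelled.png", "example-image-model")
```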

The Plan also notes that the Australian Framework for Generative AI in Schools “provides nationally consistent guidance.” Again, if you haven’t engaged with the Framework, now’s the time…

What Does the Plan Mean for Schools? Policies, Processes, and Professional Development

As we head into 2026 planning, school executive teams need to be thinking seriously about AI strategy. The National AI Plan provides the policy context, but the work of implementation falls to schools. Here’s what you should be considering:

Policy Development

If your school doesn’t have an AI policy, you need one. If you have one, it probably needs updating. Start with the NAIC’s Guidance for AI Adoption and the Australian Framework for Generative AI in Schools. Your policy should cover:

  • Permitted and prohibited uses of AI by staff and students
  • Academic integrity expectations and how AI use must be declared
  • Privacy protections and data governance, including specific protocols for First Nations students and data
  • Risk assessment procedures for new AI tools (see the sketch after this list)
  • Transparency requirements for AI-generated content
  • Professional learning expectations for staff
  • Consultation processes with the school community
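On that risk assessment point: even a lightweight, structured register of which tools have been reviewed, by whom, and for what uses beats ad hoc decisions buried in email threads. Here’s a toy Python sketch of what such a register might capture – the fields are my suggestion, not a mandated format from the Plan or the Framework:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AIToolReview:
    """One entry in a school's AI tool register (illustrative fields only)."""
    tool_name: str
    vendor: str
    reviewed_on: date
    data_stored_in_australia: bool
    privacy_impact_assessed: bool
    approved_for: list[str] = field(default_factory=list)  # e.g. ["staff"], ["students 7-12"]
    notes: str = ""

register = [
    AIToolReview(
        tool_name="ExampleChat",        # hypothetical tool
        vendor="Example Pty Ltd",
        reviewed_on=date(2026, 1, 20),
        data_stored_in_australia=True,
        privacy_impact_assessed=False,
        approved_for=["staff"],
        notes="Student use on hold pending a privacy impact assessment.",
    ),
]

# Surface anything that still needs a privacy impact assessment.
for review in register:
    if not review.privacy_impact_assessed:
        print(f"Action required: {review.tool_name} ({review.vendor})")
```

A spreadsheet does the same job, of course; the point is that the same questions get asked consistently for every tool.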

Importantly, your policy needs to be more than a document that sits in a drawer. It needs to be living, reviewed regularly, and actually enforced. I’ve worked with dozens of schools over the past few years on producing, updating, and auditing policies in line with documents like the Framework. Get in touch if you’d like to discuss.

Procurement: Who Decides Which AI Tools You Use?

Here’s something else many schools haven’t fully grasped yet: you may not get to choose which AI tools you use for much longer.

The NSW Department of Education has already piloted NSWEduChat, a purpose-built, department-owned generative AI tool. The trial began with 16 schools in Term 1 2024, expanded to 50 schools in Term 2, rolled out to all staff in Terms 3 and 4 2024, and is now available to all Year 5-12 students in NSW public schools. NSWEduChat is hosted in Azure’s Sydney data centre, uses a blend of OpenAI models, includes eight different content filtering systems, and explicitly refuses to provide direct answers – instead responding with guiding questions to encourage critical thinking.
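We don’t know exactly how NSWEduChat implements that behaviour internally, but part of it can be expressed through a system prompt. Here’s a hypothetical sketch using the OpenAI Python SDK – emphatically not the department’s implementation, just an illustration of the guiding-questions pattern (and a prompt alone is nowhere near sufficient; NSWEduChat layers eight content filtering systems on top):

```python
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable

# A hypothetical system prompt in the spirit of NSWEduChat's design:
# steer the model towards guiding questions rather than direct answers.
SOCRATIC_PROMPT = (
    "You are a study assistant for school students. Never give the final "
    "answer to a homework or assessment question. Instead, ask guiding "
    "questions and offer hints that help the student work it out themselves."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; the real tool uses "a blend of OpenAI models"
    messages=[
        {"role": "system", "content": SOCRATIC_PROMPT},
        {"role": "user", "content": "What's the answer to 3x + 5 = 20?"},
    ],
)
print(response.choices[0].message.content)
```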

The NSW Department describes NSWEduChat as “currently the safest and most cost-effective generative AI tool available for our schools and staff” and states that “the use of individual free-to-use generative AI tools other than NSWEduChat is neither endorsed nor recommended by the department.” South Australia is doing something similar with EdChat, which rolled out to all public high schools from October 2025.

This suggests that state governments can and will dictate which AI systems government schools are allowed to use. At the moment, procurement of AI in schools is the Wild West. Many schools, under the influence of sometimes aggressive sales teams, are declaring their allegiance to existing companies such as Microsoft or Google, investing in Copilot subscriptions or Google Workspace AI features. Schools may find themselves in a situation where they are told to use a specific, state government-endorsed chatbot, and all those Microsoft or Google investments become redundant or prohibited.

Catholic and independent schools face a slightly different situation – you have more autonomy over procurement decisions – but you should still be watching what state governments are doing. If NSW and SA can build their own AI tools, diocesan systems or independent school associations might do the same. At minimum, you should be ensuring that any AI tools you procure are aligned with the Australian Framework for Generative AI in Schools and can demonstrate compliance with Australian privacy law and Indigenous data sovereignty principles.

My advice: don’t make long-term commitments to commercial AI platforms right now unless you’re confident they align with where government policy is headed. And if your state education department releases its own AI tool, pay very close attention.

Professional Development: A Three-Dimensional Approach

I have written previously about the need for balance when discussing professional learning and AI. The model I outlined – developing technological expertise, domain expertise, and situated expertise simultaneously – aligns well with what the National AI Plan is proposing.

For teachers, this means we need:

  1. Technological expertise: Understanding how AI works, what its limitations are, how to use it ethically, and how to teach students to use it responsibly.
  2. Domain expertise: Knowing how AI applies specifically to your subject area – how it’s used in scientific research, literary analysis, mathematical modelling, or whatever discipline you teach.
  3. Situated expertise: Understanding your classroom context, your students’ capabilities, institutional policies, and professional identity as a teacher who may or may not choose to integrate AI into pedagogy.

The Plan also emphasises that “workers and unions must have a strong voice in how AI is adopted across workplaces” and that deployment should involve “meaningful consultation with workers, including into the design of AI systems.” This applies to schools. Teachers aren’t just implementing AI policy – we should be helping to design it. If your school or system is rolling out AI tools without consulting teaching staff, that’s a problem.

For 2026 planning, every school should be:

  • Auditing current staff AI capabilities and comfort levels
  • Providing differentiated professional learning based on staff needs
  • Creating opportunities for staff to experiment with AI tools safely
  • Building AI literacy into curriculum planning across all key learning areas
  • Establishing communities of practice where staff can share successes and failures
  • Ensuring that professional learning isn’t just “here’s how to use ChatGPT” but includes ethical considerations, limitations, and critical perspectives

The Plan also commits to “reviewing Work Health and Safety laws” in relation to AI, with Safe Work Australia including AI in the best practice review. For schools, this is a reminder that AI use in workplaces (including schools) needs to consider psychosocial risks, teacher wellbeing, and the impact of AI tools on workload and autonomy.

Student AI Skills Across All Subjects

The Plan is clear that “broad AI skills and credentials are essential across the workforce” and that specialist AI expertise is needed for development and deployment. Schools need to provide both. This means:

  • Foundational AI instruction for all students: Understanding what AI is, how it works, its benefits and limitations, ethical considerations, and how to use it responsibly. This should be integrated across the curriculum, not siloed in one subject.
  • Specialist AI pathways: For students interested in computing, data science, or AI development, there should be clear pathways through electives, VET qualifications, and senior subject choices.
  • Application-specific learning: Every subject should be teaching students how AI applies in that discipline – how historians use AI to analyse texts, how scientists use it in research, how artists engage with generative tools.

The Plan notes that the government is “expanding access to AI and digital skills” and that “Jobs and Skills Australia provides evidence-based analysis of labour market trends and skills needs” including “studies on how generative AI is reshaping job roles.” Schools should be using this data to inform careers education and subject selection guidance.

Assessment and Academic Integrity

The Plan doesn’t specifically address assessment, but the implications are clear. If students have access to AI tools – and they do, whether through official school channels or their own devices – then assessment practices need to evolve.

The NAIC guidance on “Being Clear about AI-Generated Content” is relevant here. Schools need frameworks for:

  • How students declare AI use in assignments
  • What constitutes appropriate versus inappropriate AI use for different tasks
  • How to assess AI-assisted work fairly
  • Teaching students to use AI as a tool for learning rather than a shortcut to answers

Equity and Access

If you’re in a regional or remote school, or a school with significant numbers of students from low SES backgrounds, the digital divide is likely already affecting your ability to respond to AI. The Plan commits to infrastructure upgrades and digital inclusion programs, but you can’t wait for those to trickle down. Your 2026 planning needs to include:

  • Device access programs if you don’t already have them
  • Digital literacy programs that are prerequisites for AI literacy
  • Partnerships with libraries, community centres, or other organisations that can provide connectivity
  • Creative solutions for students without home internet access

For schools with significant First Nations student populations, the Plan’s commitments to Indigenous data sovereignty should also shape your approach. This means:

  • Consulting with local Aboriginal and Torres Strait Islander communities about AI use
  • Ensuring AI tools don’t perpetuate harmful stereotypes
  • Protecting Indigenous knowledge and cultural protocols
  • Implementing the Framework for Governance of Indigenous Data in your AI policies

The Plan also notes gender gaps in technology, with “Working for Women: A Strategy for Gender Equality” influencing skills development. Schools should be actively encouraging girls into AI and technology subjects, countering stereotypes, and ensuring female role models are visible in AI education.

Governance and Accountability

Finally, the Plan describes how every APS agency will have a “Chief AI Officer to drive adoption” with “AI use tracked and reported on.” Schools don’t necessarily need to create new positions, but you do need clear accountability structures:

  • Who is responsible for AI policy in your school?
  • Who reviews and approves new AI tools?
  • Who monitors for harms or unintended consequences?
  • Who leads professional development?
  • Who engages with parents and community?

For many schools, these responsibilities will fall to existing roles – Director of Teaching and Learning, Head of Digital Technologies, eLearning Coordinator – but those people need support, time, and authority to do the work properly.

A Final Thought

The National AI Plan 2025 is much more ambitious than earlier frameworks and documents. It commits to building infrastructure, spreading benefits equitably, and keeping Australians safe while positioning Australia as a regional leader in responsible AI development and adoption.

The opportunities are seductive: better tools for teachers, more engaging learning for students, improved services, and pathways to careers that don’t exist yet. But the obligations and challenges are equally important: we need to protect student privacy and safety, address the digital divide, respect First Nations data sovereignty, update our assessment practices, and ensure that AI enhances rather than replaces human teaching.

As you head into 2026 planning, don’t treat AI as an optional extra or a technology fad that will pass. The federal government has made it clear that AI is a national priority, and education is central to that strategy. Read the Plan. Read the Australian Framework for Generative AI in Schools. Use the NAIC resources. Talk to your staff, your students, and your community. Build policies that are both ambitious and cautious. And remember: the goal isn’t to use AI for AI’s sake – it’s to use it in ways that genuinely serve your students and your school community.

Want to learn more about GenAI professional development and advisory services, or just have questions or comments? Get in touch.
