Through the FSO Skills Accelerator-AI, NextEd and TAFE SA transformed shared concerns about assessment authenticity into practical resources now available for RTOs across the sector.
At a glance
FSO Skills Accelerator-AI initiative: Assessment Authenticity in the Age of AI
Timeframe: Skills Accelerator Action Learning Sprint 1, Oct-Dec 2025
Primary goal: Equip educators with frameworks and tools to confidently manage AI use in student submissions.
Key outcomes
- Educators gained confidence in identifying AI use when reviewing assessments.
- Stronger alignment between learning design teams and compliance staff.
- Development and adoption of practical assessment authenticity tools for the sector.
Download the Assessment Authenticity Risk Matrix and Sample Analysis.
The challenge
As AI tools became increasingly accessible to learners, both NextEd and TAFE SA observed a rapid rise in assessment submissions that may have been partially or fully generated with AI.
Educators raised concerns about maintaining assessment authenticity, identifying AI-generated submissions with confidence, and ensuring assessments remained fair and reflective of genuine learner capability. The lack of consistent frameworks or tools, combined with increased time reviewing questionable submissions, created an urgent challenge.
As assessment integrity is central to quality training and compliance, both organisations recognised that AI capability uplift was now mission-critical, not only for teaching and learning, but for protecting the credibility of qualifications.
This challenge became the catalyst for a joint knowledge-sharing initiative, supported through the FSO Skills Accelerator-AI.
About the initiative
NextEd and TAFE SA collaborated to compare approaches, tools, workflows and governance practices used to review student assessments for signs of AI use.
The initiative brought together educators and assessors, quality teams responsible for compliance, and learning design teams updating guidance and assessment frameworks across both training providers.
The goal was to develop and share consistent frameworks, practical tools and workflows that would increase educator confidence in reviewing assessments, strengthen collaboration between quality and learning design teams, and maintain assessment integrity in the age of AI.
To achieve this, the collaborative approach included:
- Structured knowledge-sharing workshops where both organisations compared existing frameworks, matrices and guidelines
- Demonstrations of toolsets used to identify AI-generated patterns
- Exploration of authenticity rubrics and decision trees to guide consistent assessment review
The impact
The knowledge-share delivered immediate practical benefits for both organisations.
For systems and processes:
- Quality teams introduced consistent processes for assessing authenticity risk across departments and faculties
- Adoption of refined assessment authenticity tools drawing on shared insights
For people and culture:
- Improved educator confidence when reviewing assessments with potential AI assistance
- Stronger alignment between learning design teams and compliance staff
- Increased clarity and transparency for learners about appropriate and inappropriate use of AI
The initiative reinforced that AI detection is not about policing students but about supporting academic integrity and authentic skill development.
The process also uncovered new opportunities to update assessment tasks, strengthen instructions and broaden formative assessment use.
“Sharing tools and knowledge has strengthened the way we safely integrate AI into curriculum, while maintaining assessment authenticity.”
Alex Gilbey, Academic Director, Technology & Design, NextEd Group
What worked well
- The peer-to-peer format sparked open discussion and rapid improvement
- Real assessment examples helped ground the work in practical reality
- Both organisations benefited from discovering they were facing the same challenges, which normalised uncertainty and improved confidence
Challenges faced
- No single tool or method could detect AI with 100% accuracy, requiring both organisations to develop layered approaches rather than rely on technology alone.
- Educators initially approached the topic with differing levels of comfort and digital literacy, underscoring the need for shared learning.
- Balancing fairness to learners with compliance needs required careful communication and supportive frameworks, not punitive detection mechanisms.
Key insight: Assessment integrity in the age of AI is best strengthened through shared practice, not isolated efforts.
Next steps
Both NextEd and TAFE SA will expand this work by embedding the shared assessment-review workflow into internal educator guides and testing improved assessment task design principles that naturally reduce the risk of AI-generated submissions.
They will continue to refine the Assessment Authenticity Risk Matrix based on classroom experience and will share updated resources with other training providers participating in the Skills Accelerator.
Strengthen assessment integrity in your RTO
Download the practical tools NextEd and TAFE SA co-designed through the Skills Accelerator to strengthen assessment integrity in your RTO.
Resources include:
- Assessment Authenticity Risk Matrix, including risk rating framework, mitigation strategies, and ready-to-use template.
- Sample Assessment Authenticity Analysis, an example showing the matrix applied to real assessments.
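The core idea behind a risk matrix of this kind can be sketched in a few lines of code: rate each assessment task on two axes (how easily AI could complete it, and how much that would undermine the qualification), then map the combined rating to a mitigation strategy. The sketch below is illustrative only; the rating levels, task names and mitigation wording are assumptions, not the published FSO tool.

```python
# Illustrative sketch of a two-axis risk-matrix lookup.
# NOT the published Assessment Authenticity Risk Matrix; the levels and
# mitigation strategies here are hypothetical placeholders.

LEVELS = ("low", "medium", "high")

# Hypothetical mitigation strategies keyed by overall risk rating.
MITIGATIONS = {
    "low": "Standard review; remind learners of the AI-use policy.",
    "medium": "Add an oral follow-up question or short reflection task.",
    "high": "Redesign toward observed, practical or supervised assessment.",
}

def overall_risk(likelihood: str, impact: str) -> str:
    """Combine the two ratings: overall risk is the higher of the two."""
    if likelihood not in LEVELS or impact not in LEVELS:
        raise ValueError("ratings must be 'low', 'medium' or 'high'")
    return LEVELS[max(LEVELS.index(likelihood), LEVELS.index(impact))]

def rate_task(name: str, likelihood: str, impact: str) -> dict:
    """Return the task's overall risk and a suggested mitigation."""
    risk = overall_risk(likelihood, impact)
    return {"task": name, "risk": risk, "mitigation": MITIGATIONS[risk]}

# Example: a take-home essay is easy to generate with AI (high likelihood)
# and central to the unit outcome (high impact), so it rates as high risk.
print(rate_task("Take-home essay", "high", "high"))
```

Taking the higher of the two ratings is one simple combination rule; a matrix like the downloadable template can encode any mapping the quality team agrees on.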