For decades, many education systems operated in a world of information scarcity. Schools delivered content, tested recall, and rewarded routine accuracy. That made sense when work rewarded memory and repetition.
That model is now under strain. Generative AI can draft essays, explain concepts, and solve standard problems quickly. UNESCO has warned that education systems need guidance and capacity so AI use remains human-centred and safe.
Why recall-heavy learning now under-delivers
When students can retrieve and generate answers in seconds, recall becomes a weaker proxy for learning. If we keep assessing what AI can do easily, we teach students to outsource their thinking. Over time, that can dull judgement and erode confidence.
The risk is not only academic. It is cultural. If students see school as pointless, motivation drops and trust erodes.
AI saturation brings both opportunity and risk
AI can help schools remove friction. It can also widen gaps between students if leaders do not set clear expectations and guardrails.
Opportunities worth using well
Used wisely, AI can free time for the work that matters most:
- faster feedback cycles, with teacher oversight
- better differentiation and support for varied needs
- reduced admin load, so teachers can mentor and coach more
UNESCO’s work on AI in education frames this potential as part of a wider push to support teaching, learning, and assessment, while keeping values and rights in view.
Risks leaders must name directly
The risks are practical and ethical:
- unequal access across schools and families
- weak data privacy and unclear tool quality
- academic integrity confusion, if rules are vague
- reduced human connection, if screens replace relationships
UNESCO’s guidance on generative AI highlights the need for immediate actions, policy development, and human capacity so adoption does not outpace safety and ethics.
Redefine the purpose of schooling around the human core
If AI can produce content at scale, education must shift its centre of gravity. The purpose becomes human development, not content delivery.
Human-centred AI in education prioritises what technology cannot replicate well:
- Creativity and original thought: new ideas, fresh links, and real invention.
- Ethical judgement: weighing trade-offs and values in messy situations.
- Empathy and relationships: care, trust, teamwork, and belonging.
- Critical evaluation: judging quality, bias, and truth in outputs.
- Adaptability: learning, unlearning, and relearning with confidence.
This is not anti-technology. It is pro-human learning, with AI as a tool.
Practical steps for educational leaders
Leaders translate purpose into practice. Start with moves that change daily routines, not just policy documents:
1. Audit curriculum for “human work”
Identify where AI can automate routine tasks. Then reallocate time to discussion, inquiry, performance, and reflection. Protect space for deep reading, debate, and problem solving.
2. Invest in teacher capability, not just tools
Train staff in safe AI use, but also in teaching the human skills above. UNESCO’s AI Competency Framework for Teachers sets out knowledge, skills, and values across areas such as human-centred mindset, ethics, pedagogy, and professional learning.
3. Update assessment so process matters
Shift towards assessment formats that reveal thinking:
- oral defences and viva-style questioning
- project work with drafts, critique, and reflection
- in-class performance tasks and practical products
- annotated evidence of decision-making
This reduces the payoff of copy-paste work and increases the value of judgement.
4. Build shared norms for responsible use
Create a simple, school-wide AI agreement. Co-design parts of it with students. Be clear on what is allowed, what must be declared, and what is not acceptable. Revisit it as tools evolve.
UNESCO also provides an AI Competency Framework for Students that supports safe, meaningful engagement with AI through defined competencies.
5. Protect relationships as the learning engine
Make sure AI use increases time for human connection. Guard the basics: greeting students, conferencing with writers, coaching groups, and building belonging.
6. Monitor equity and access
Track who is benefiting and who is being left behind. If access varies, design school-based solutions so AI does not widen gaps.
A simple starting point for this term
If you want one concrete beginning, try this three-step approach:
- Redesign one assessment to value reasoning and voice.
- Run one staff session using a shared AI use protocol.
- Agree one non-negotiable about relationships and learning time.
Human-centred AI in education is not a future project. It is a leadership choice made in daily decisions, meeting by meeting, task by task.
What is the purpose of education in your school right now? If AI removed every routine task tomorrow, what would you want teachers and students to spend their time doing?
Want more insights? Read Collective Responsibility in Education: Be the Solution or Reading The World in School Leadership.