JON CRIPWELL, NATIONAL EDUCATION LEAD, PRIMARY MATHS AT TWINKL EDUCATIONAL PUBLISHING, UK
Can AI (artificial intelligence) really help children who struggle in maths? It’s a question I’ve heard increasingly often in recent months, as generative artificial intelligence (GenAI) has hit the headlines and started to become integrated into our everyday lives. Usually, behind the curiosity lies a dual tension: excitement about AI’s potential to personalise learning, and caution about its limitations, particularly for learners with additional needs.
As a former senior leader of a primary school and current National Education Lead for Primary Maths, I’ve seen how powerful it can be when pupils with Special Educational Needs and Disabilities (SEND) experience support that feels genuinely attuned to how they learn. When that connection is missing, pupils can sometimes become disengaged or convinced that maths ‘just isn’t for them’. Yet for teachers, planning this kind of support, particularly in busy mainstream primary classrooms, remains a challenge. The potential of AI to adapt in real time, generate bespoke resources, or present content in accessible formats offers a compelling vision of more inclusive practice. Early case studies suggest that when maths materials are tailored to pupils’ specific needs and interests, including autistic and dyslexic pupils, learners report improved engagement, confidence and understanding (Stavroulia et al., 2024).
However, the research also urges caution. AI tools are not neutral; they reflect the data they are trained on and the values of those who design them (Holmes & Tuomi, 2022). Concerns around bias, transparency and oversight are particularly pressing when working with vulnerable learners (Göransson & Nilholm, 2014; Holmes et al., 2023). While AI can support teachers to notice and respond to pupil thinking, it cannot replace the complex judgment and deep subject knowledge that responsive, quality-first teaching demands (Boaler & Brodie, 2004).
So where does that leave us? Between promise and peril perhaps lies a space worth exploring: the ways AI might help teachers make maths more inclusive, not by automating the complex work of teaching, but by providing adaptable, context-aware support that reflects the principles of inclusive practice and strengthens effective pedagogy.
Rethinking inclusion
Inclusive education may be understood in narrow terms as ensuring that pupils with SEND are integrated into mainstream classrooms. But true inclusion is rooted in how teaching is designed and delivered, and in whether the learning environment enables all pupils to participate meaningfully, access the curriculum and experience success.
Göransson and Nilholm (2014) outline four broad ideas of inclusion, ranging from placement-focused models to those that prioritise equitable, participatory communities. Furthermore, Florian and Black-Hawkins (2011) argue that inclusive pedagogy is not about narrowing the curriculum for some learners, but about extending what is ordinarily available to all. These broader conceptualisations frame inclusion not as a special intervention for a few, but as a commitment to equity and excellence for all.
This aligns with the principles underpinning teaching for mastery in maths. Mastery approaches reject fixed-ability thinking and emphasise the belief that all pupils can learn maths with the right support. As Boylan et al. (2019) observe, mastery has the potential to reduce attainment gaps, but only when implemented in ways that remain responsive to individual pupil needs. This means carefully selected tasks, appropriate scaffolding and high expectations for all learners.
The National Centre for Excellence in the Teaching of Mathematics (NCETM) reports that, as of August 2024, 85.5 per cent of open primary schools in England have engaged with the Maths Hubs Programme since its inception in 2015–16, with nearly 70 per cent participating in the Teaching for Mastery pathway during that period. This widespread adoption reflects a national commitment to approaches that align with inclusive pedagogies; approaches that assume all pupils can succeed in maths, given the right structures and support.
However, the 2023 Ofsted report Coordinating Mathematical Success cautions against equating uniform curriculum delivery with true inclusion. It notes that when some pupils with SEND are given the same curriculum as their peers, often with the best of intentions, it can create an ‘appearance’ of inclusivity. This approach may hinder meaningful learning rather than secure it.
Inclusive maths teaching, then, is not something separate from effective pedagogy; it is the litmus test of it.
Personalised, not prescriptive: What AI could offer
The promise of artificial intelligence in education often hinges on a single word: personalisation. In theory, AI tools can respond on an individual basis to each pupil, offering tailored explanations, scaffolded practice and adaptive feedback. In a subject like maths, where new learning depends on secure prior knowledge, this kind of responsive support could be particularly powerful. For learners with SEND, difficulties with working memory, processing speed, or language can make it harder to keep pace, especially when gaps compound over time. AI may provide an additional way to help identify and address those gaps at the earliest opportunity.
Case studies already suggest potential. In one small-scale study, ChatGPT-3.5 was used to generate worksheets for two 13-year-old pupils, one dyslexic, the other autistic, incorporating personal interests into the learning objectives. Both pupils reported greater enjoyment and confidence, and the teacher observed increased participation and retention of concepts (Psyridou et al., 2024). While certainly limited in scale and context, the study illustrates how AI might foster not only understanding, but also a sense of connection to and engagement with the learning.
Scaling support, not simplifying learning
Some AI systems can adapt in real time to learners’ responses. These tools detect misconceptions, provide scaffolding and adjust content dynamically. In a meta-analysis, VanLehn (2011) found that well-designed intelligent tutoring systems can yield learning gains approaching those of human tutoring, particularly in the procedural aspects of maths. For learners with SEND, where cognitive load may impede fluency, the ability to proceed at their own pace with responsive feedback can offer an advantage (Koedinger & Aleven, 2007).
AI can also help reduce a persistent barrier to inclusive practice: preparation time. Teachers often cite workload as a major challenge when adapting materials (Luckin et al., 2016). According to the Department for Education’s 2023 Working Lives of Teachers and Leaders survey, full-time teachers in England reported working an average of 52.4 hours per week, with leaders averaging 58.2 hours. This workload not only affects teachers’ wellbeing but also limits their capacity to provide vital personalised support to pupils. Generative tools can rewrite questions, simplify language and create scaffolded versions, supporting access while preserving teacher time for instruction and feedback (Demo et al., 2023; Holmes & Tuomi, 2022).
It is, however, important to distinguish between responsive and prescriptive personalisation. The former empowers teachers to understand and respond to pupils’ thinking; the latter risks boxing learners into fixed pathways based on limited data. As Holmes et al. (2022) warn, AI systems are only as inclusive as the assumptions they are created with. If not carefully designed, they may reinforce disadvantage by adapting to patterns of performance without recognising the learner’s context or potential.
Done well, AI does not replace teacher judgment; it extends it. It supports variation, enables formative assessment, and offers alternative ways in. As one of NCETM’s Five Big Ideas in Teaching for Mastery (NCETM, 2023), variation helps learners access key mathematical ideas in meaningful ways. AI tools that support this can enhance inclusion.
Limitations and risks
While the potential of AI to support inclusive maths teaching is clear, so too are its limitations.
First, there is bias. AI systems are trained on datasets that reflect dominant cultures. As Cukurova et al. (2023) note, many models are trained on data from ‘WEIRD’ populations (Western, Educated, Industrialised, Rich and Democratic), which can marginalise students from different linguistic, cultural or neurodivergent backgrounds. For example, if a chatbot struggles to interpret non-standard syntax or alternative problem-solving strategies, it may misdiagnose a pupil’s needs.
Bias also operates through algorithmic assumptions. Holmes and Tuomi (2022) argue that many systems adopt an ‘instructionist’ model, prioritising linear progress, which may fail to capture the relational nature of learning seen in inclusive pedagogies. This results in narrow definitions of success that overlook varied demonstrations of understanding.
For instance, a pupil might solve a problem using an unconventional method, yet an AI system might flag this as an error. Without professional interpretation, such systems risk penalising differences.
Privacy also demands thoughtful attention. Many AI tools collect detailed performance data. Without boundaries, these systems risk over-monitoring, especially for learners with SEND. In classrooms where trust and safety are essential, AI must protect pupils’ dignity as well as their data.
Over-reliance is another potential pitfall: if AI outputs are treated as infallible, teachers may defer to them uncritically. As Williamson and Eynon (2020) note, many systems function as ‘black boxes,’ lacking transparency. In inclusive education, where contextual understanding is vital, this opacity can be problematic.
A further concern is the impact on learning as a social activity. If pupils with SEND are consistently accessing their learning through devices, while others engage with shared maths experiences and the benefits of a dialogue-rich environment, we risk reinforcing exclusion in the name of inclusion. Technology should support participation, not separate it.
And fundamentally, even the most advanced tools cannot replace the professional expertise required to notice cues, adapt explanations or build trust with a child. If AI is used to deskill educators, we risk undermining the pedagogical flexibility that effective SEND support demands.
A human-in-the-loop approach
If AI is to play a meaningful role in inclusive maths education, we need more than better algorithms; we need better questions. Who is this tool for? What assumptions does it make? How might it amplify, rather than erode, teacher judgment?
Inclusive pedagogy begins with the belief that all children can learn, and that diversity is not a problem to be fixed, but a resource to be mobilised (Florian & Black-Hawkins, 2011). Taking this view, AI is another scaffold, a tool that, like a textbook or number line, helps pupils access mathematical ideas. Its role is to support teachers to notice, interpret, and respond to how learners engage.
This demands a human-in-the-loop model, where AI tools are integrated with the teacher, not the technology, at the centre. GenAI can create resources, modify lesson plans or tasks to reflect a pupil’s interests, but it takes professional knowledge to judge their appropriateness. Like any tool, its value depends on how it is used.
We must also evaluate AI not just for technical performance, but for how well it reflects inclusive values. Do tools support collaboration, not just individualisation? Can they accommodate different ways of expressing understanding? Do they respect agency and protect data? These are questions that teachers and leaders must feel empowered to ask, and that developers and policymakers must be prepared to answer.
Inclusion is not something we can automate. But with thoughtful implementation, AI can become part of a wider toolkit that helps more children feel seen and supported, and helps every pupil, regardless of need, experience success and enjoyment in maths.
Tips for careful implementation
- Start with scaffolding: Use AI to generate scaffolded tasks, to simplify language, or to personalise contexts, but ensure that all pupils engage with the same core ideas. AI should extend what’s ordinarily available to everyone, not create separate tracks.
- Make workload reduction meaningful: Lean on GenAI tools to save planning time, e.g. modifying tasks or creating visual aids, but reinvest that time in responsive teaching: formative assessment, feedback and pupil conversation.
- Prioritise professional judgment over automation: AI tools can’t understand classroom nuance. Always evaluate outputs for bias, appropriateness and inclusivity. Avoid over-relying on ‘black box’ systems that lack transparency.
- Use variation to support conceptual access: AI can generate different representations or examples of the same concept. This kind of variation, when aligned with mastery principles, helps pupils to build deeper understanding.
- Keep AI use social: Inclusive learning is social and collaborative. If AI use isolates learners, it may reinforce exclusion. Design AI-supported tasks that still allow for dialogue, collaboration and classroom interaction.
The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for Education or the Chartered College of Teaching, and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be elsewhere. You should always follow the DfE’s Generative AI in Education policy position and product safety expectations, in addition to aligning any AI use with the DfE’s latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk.