SAM LOVATT, SENIOR LECTURER IN PRIMARY EDUCATION, ST MARY’S UNIVERSITY, UK
Children and young people are surrounded by technology, with many of them (nine out of 10) owning a smartphone by the time they attend secondary school (Woodhouse and Lalic, 2024). A recent report from Ofcom (2024) found that 96 per cent of children between the ages of three and 17 go online regularly via a range of smartphones, tablets and laptops. With device ownership and use so high, would teachers be right to assume that because children and young people have grown up in a world where technology is readily available, they can use digital tools such as artificial intelligence (AI) successfully and ethically? The digital native narrative would assume so (Prensky, 2001). A digital native is someone who understands how to use digital devices because they use them regularly (Prensky, 2001).
However, Müller and Goldenberg (2021) are right to highlight that exposure to a digital tool alone does not equip the user with the digital literacy skills necessary to use it successfully and ethically. Ofsted (2022) supports this, arguing that teachers who hold a digital native view place a barrier in the way of children and young people developing digital literacy skills, as they assume knowledge and skills and choose not to explicitly teach and model them. Despite this, the digital native narrative is still prevalent in mainstream media and education (Mertala et al., 2024) and, as such, there are implications that will be examined below. As new technology such as generative AI emerges and becomes freely available on the internet, teachers need to be mindful not to fall into the digital native or AI native mindset, and instead consider a place for teaching about AI in their curriculum.
Why artificial intelligence should be part of the curriculum in all key stages
The Department for Education's policy position on AI in education (DfE, 2023) states that teachers should teach children and young people about emerging technologies so that they are prepared for changing workplaces. Teachers are required to explicitly teach children and young people about AI and generative AI tools and how to use them. Assuming that, just because students have used these tools before, they can use them effectively and ethically is dangerous. UNESCO (2024) reports that integrating learning about AI into curriculums is crucial to educating digitally literate students. Digital literacy is more than just learning how to be a successful user of a tool; it is about being able to use the tool in a discerning and safe manner (Ofsted, 2022).
When considering how to teach about AI tools in the primary classroom, Yang (2022) suggests that teachers should consider what is age-appropriate and relevant for the children; it is also important that the tools chosen are relevant to the current curriculum. Developing this further, Sentance and Waite (2022) argue that teaching about AI promotes safe use, requires creative thinking and empowers students to be change-makers. They propose the SEAME framework as a means of ensuring that children and young people are taught about a range of AI tools. The framework identifies four levels at which AI can be explored and taught: social and ethical considerations, the applications of AI, how models are trained, and the underlying engines.
Before teaching about AI, teachers should acknowledge some wider cautions about these tools. There are, of course, the widely written-about cautions, such as data protection, misinformation and ethical use (DfE, 2023; Lovatt et al., 2024; Ofcom, 2024); however, there are also broader concerns for teachers to consider. Illingworth (2023) raises an important issue surrounding digital equity and access to AI tools away from the school setting. Not all students will have access to the tools outside of school, whether because of unreliable internet access or a lack of appropriate devices, so schools need to ensure that there is opportunity for children and young people to engage with and learn about these tools in their curriculum. It is also important to consider the implications that these tools can have for the environment and for sustainability (Selwyn, 2024). Engaging children and young people in discussions about the wider ethical impact of AI tools can support them in developing an informed and ethical view of AI.
What can teachers do?
Responsibility for teaching digital literacy and the use of AI should extend beyond those who teach computing. To avoid holding a digital native mindset about the children and young people whom they teach, teachers should ensure that they are teaching about the tools and how to use them effectively. Consider how these tools can be modelled in front of students, sharing whether they have been used in lesson preparation and encouraging children and young people to use them for more than just information-gathering, for example for revision tasks or for defining key vocabulary within a context. Engage children and young people in discussions about the uses of AI and listen to their opinions on the tools.
While considering appropriate places for AI to fit into the curriculum, it is important to consider children and young people's existing knowledge and applications of the tools. Findings from Lovatt et al. (2024) suggest that children and young people use the tools for a range of reasons; taking their current uses of the tools as a starting point can support engagement and progression in knowledge and skills. At the heart of challenging a digital native mindset is the prioritising of a digital literacy curriculum. This priority needs to extend beyond the computing teacher; consider how you can prioritise digital literacy within your subject and in ways that align with your curriculum.
AI tools have the potential to change the education landscape and revolutionise how students learn. If teachers want their children and young people to use these tools effectively, ethically and with confidence, then students need to be taught to do so. Educators who hold a digital native mindset present a barrier to children and young people developing digital literacy skills that they can use to support their learning, both in school and beyond.
The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for Education or the Chartered College of Teaching, and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be elsewhere. You should always follow the DfE's generative AI in education policy position and product safety expectations, in addition to aligning any AI use with the DfE's latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk.