2025 Special Issue: Safe and effective use of AI in education

Published: June 2025
Guest Edited by:

This issue explores the theme of safe and effective use of artificial intelligence in education, with sections on:

  • How can AI support teachers and learners?
  • How can AI enhance whole-school approaches and professional development?
  • How will AI impact the role of the teacher?
  • How are schools using AI tools?
  • What does the future hold?

FROM THE EDITOR

Dr Cat Scutt MBE, Deputy Chief Executive, Chartered College of Teaching, UK

In Spring 2019, the Chartered College of Teaching published a special issue of Impact focused on EdTech, funded by the Department for Education (DfE). In the six years since then, EdTech use in schools has changed almost beyond recognition: firstly, due to the COVID-19 lockdown requiring teachers to move rapidly to online and blended learning models, and the resultant changing expectations from pupils and parents; and more recently, owing to the advances in generative artificial intelligence (GenAI) technology, and its widespread availability to the general public.

While many schools were already utilising online tools and, in some cases, remote lessons even prior to the COVID-19 pandemic, AI was not even mentioned in that 2019 issue of Impact. Although artificial intelligence technology was of course already being used in various ways within EdTech tools used by schools, as well as to power chatbots and other widely available technologies used by the public, it was the launch of ChatGPT in November 2022 that really marked the beginning of the significant impact of GenAI in schools, colleges and other education settings.

The rise of GenAI undoubtedly offers some exciting opportunities and applications for education – but it also comes with significant risks. While the technology itself is perhaps very different from tech tools used in education previously, in many ways the promises are not so different from those that were considered back in that 2019 Impact issue: potential for workload reduction and widened professional development approaches for teachers; increased engagement for students; novel approaches to assessment; opportunities for independent student learning and practice; and the development of new skills for careers of the future.

The risks, too, are not unfamiliar: concerns around ethics; academic integrity; a loss of the human and relational aspects of education; implications for cognitive development; and, as Neil Selwyn reminded us in his editorial for the 2019 issue, the risk of the Matthew Effect – the phenomenon whereby those who already have an advantage accumulate more advantages through access to novel technologies, and those with disadvantages become even more disadvantaged, across the gulf of the ‘digital divide’.

Widespread AI use has profound implications across early years settings, schools, colleges and other education providers. A particular risk lies in the very ‘newness’ of GenAI technologies, especially given the scale of their current use. The research into their impact, including over the medium and long term, is very limited, and we need to act with caution. This is why many of the articles in this special issue of Impact are so tentative in their conclusions; we can’t yet claim with any certainty how AI is impacting, and will impact, teaching and learning, and teachers and learners, going forward.

But I believe it is incredibly important to be sharing and reflecting on research and practice around AI in education, drawing on what we do know, considering potentially useful applications and reflecting on the challenges and risks of which we need to be aware. For me, three key themes emerge from the articles in this special PDF issue.

The first of these is that safety, ethics and privacy issues must be at the heart of our decisions around AI use. The DfE has published a framework for AI product safety expectations in education, setting out detailed safety standards and technical safeguards to protect students. The DfE has also commissioned Chiltern Learning Trust and the Chartered College of Teaching to develop training materials on the safe and effective use of AI, due to be published later this year, with the opportunity for teachers to gain certification through the Chartered College of Teaching. Through this training and accreditation, teachers will be able to build and demonstrate their understanding of the tools and policies required to keep students and staff safe while embracing the exciting new technologies available.

And we must, of course, ensure that we are following the latest guidelines and legal requirements for use of AI tools and that they are adopted within the framework of our schools’ broader technology and safeguarding policies, ensuring that any implementation aligns with ethical guidelines and any age limitations on use.

A new AI content store, jointly funded by the DfE and the Department for Science, Innovation and Technology, will also pool curriculum guidance, lesson plans and anonymised pupil work, so that AI companies can train their tools using high-quality input, in order to generate effective and useful content for teachers and pupils. It is vital that we consider not only the content that we use, but also the values and ethics that underpin the development of that content.

The second theme is that we must continue to look for and generate evidence about the impact of artificial intelligence technologies in education, both positive and negative. The DfE is commissioning work in this space too, including work around the effective use of assistive technology for pupils with special educational needs and disabilities (SEND), funding for the development and evaluation of new tools, and a pilot of an EdTech Evidence Board run by the Chartered College of Teaching. This will involve developing criteria based on the best available evidence and consulting with the profession so that EdTech providers can be encouraged to conduct robust research and evaluation, and teachers and leaders can be confident that they are choosing the right tools for their contexts and needs.

The third and final theme is that we must consider the role of AI tools in the context of conversations around teacher professionalism, recruitment and retention. As articles in this issue demonstrate, it appears that there are opportunities for teacher workload to be reduced through effective use of AI technologies to support lesson planning and assessment. But this requires up-front time for teachers to develop their skills and knowledge. There are also murmurings about whether these technologies might have the capacity to replace the human teacher in the classroom. Yet it is clear to me that at the heart of teaching is the deeply human relationship – the connection between teacher and student – and that highly effective teaching depends on the knowledge and skills of expert practitioners, and this will not, and should not, change with the advent of GenAI.

The examples shared in this special issue won’t necessarily be those that you consider appropriate for your setting, and neither are they examples of perfect practice. Where authors mention specific AI tools, this is for context only and does not imply endorsement or recommendation of any particular tool. You will need to make decisions based on the needs of your pupils and your own consideration of the technologies available. I hope, however, that the articles within this special issue inspire thinking, conversations and action.

The importance of educators being aware of the potential challenges and concerns raised by widespread AI use is why we have, for the first time ever, produced a PDF issue of Impact that can be shared with all educators. We have also published a wider selection of articles in the open access online edition of this special issue, available on the Chartered College of Teaching’s platform, MyCollege. In our future content, we will continue to seek out new research and reflections on this topic so that we can maintain our awareness and progress our knowledge as a profession. I look forward to continuing the conversation.

This issue's articles