Meena Kumari Wood, RESEARCHER, AUTHOR AND FORMER HMI (Ofsted), UK
The potential impact of AI on students’ ‘soft skills’
A recent CIPD survey revealed that a third of 16–24-year-olds lack employability skills, including resilience, problem-solving and communication skills; almost half of respondents felt that they were never taught these skills at school (CIPD, 2024). Business leaders agree that oracy and listening skills are more important than ever as AI tools begin to take on the cognitive load of mundane tasks (CIPD, 2024). A recent report from the Oracy Education Commission describes oracy as ‘a foundational building block’ and recognises that ‘In this age of the robots, we want pupils to be ready to excel as human beings’ (Oracy Education Commission, p. 5). It is clear that employers and higher education (HE) institutions will continue to require young people to demonstrate the human skills not easily replicated by machines, including emotional empathy, communication and teamwork.
The reality of AI use in schools
Research from the Internet Matters Team (2024) found that over half of 13–14-year-olds use AI tools to help with or complete their schoolwork. However, nearly half of children believe that AI will be beneficial to their education, compared with just a third of parents (Internet Matters Team, 2024). The research warns that a lack of official guidance is leaving schools and many parents in the dark about the potentially profound impact of AI on homework and classroom learning: two thirds of parents have not been informed about how their child’s school plans to use generative AI tools for teaching (Internet Matters Team, 2024). Without relevant safeguards and training for teachers and children, how do schools know whether the work children submit is their own or AI-generated?
Survey results from Teacher Tapp (2024) suggest that, with overwork driving teacher shortages and squeezing planning time, AI is potentially an essential tool for supporting teachers with the day-to-day tasks that contribute to workload. Furthermore, AI tools can assist neurodiverse students by structuring workflows, study strategies and revision plans, making an overwhelming task more manageable (Teacher Tapp, 2024). Teachers are then free to adapt their practice to support students who might struggle with comprehension tasks.
AI and adaptive teaching
It can be argued that a one-size-fits-all approach to teacher talk does not account for individual differences and abilities among students and is not an inclusive pedagogy. It can often lead to some students falling behind, while others become disengaged through lack of challenge. AI can support a flipped learning approach, whereby students independently access knowledge using AI tools, either at home or in class, freeing up time for teachers to interact meaningfully with students on applying that knowledge during the lesson. This is where AI comes in as a useful additional tool, not a replacement. Teachers can use AI tools to generate customised active learning experiences that make lessons more relevant and engaging for students, and AI tutors could be used to tailor instruction to each student’s unique needs, continually adjusting content based on performance. Students can then engage with content at home more effectively, ensuring that they come to class better prepared and ready to dive into hands-on activities or discussions, thereby supporting oracy development.
At the very least, tools such as ChatGPT can bring knowledge to life by acting as a debate partner, offering added nuance, multiple perspectives, alternative viewpoints and persuasive arguments, arguably helping to stimulate creativity, literacy and oracy skills. In addition, the versatility of ChatGPT means that it can act as a language coach, enabling practice sessions for foreign languages as well as English as an Additional Language. Yet these are educational experiences that are impossible to envisage without all-important teacher input, evaluation and feedback.
What could a policy on safe use of AI look like in schools?
We have to acknowledge that AI is rapidly evolving and that its use in education is becoming more widespread. Interestingly, this shift has huge ramifications for how and what Ofsted inspects, with the focus moving towards learning and learner attributes rather than rote learning (Kumari Wood, 2025, p. 29). ‘Evolution not revolution’ (DfE and Phillipson, 2024) is the Curriculum Review’s mantra. Teaching ‘soft skills’ (skills like communication, critical thinking, reasoning and problem-solving), together with training and support for practitioners and students in the classroom, is a priority, so that AI can evolve as a friend and not a foe. What is needed is a government policy that shapes school policy on the safe and effective use of AI in education. Undoubtedly, the main principles of responsible AI development are privacy and data security. Generative AI models are trained on vast amounts of data extracted indiscriminately from the internet, which often contain personal data, so transparency, attribution and addressing bias are key.
As a starting point, a school policy could include the following considerations:
- Support learning goals: Ensure that any integration of AI tools supports and enhances the school’s curriculum objectives. AI should be a supplemental resource that promotes personalised learning, fosters critical thinking and enriches the educational experience, whilst upholding the integrity of the role of the expert teacher.
- Manage risks and privacy: Prioritise safeguarding by addressing the potential risks associated with AI, such as deepfakes, impersonation and misuse of AI tools. Policies should ensure compliance with GDPR and all other data protection regulations.
- Act transparently: Maintain clarity about where, when and how AI tools will be used within the school, ensuring that all stakeholders, including parents and learners, are informed. Staff should take responsibility for the quality and accuracy of any AI-generated content or feedback used in teaching or assessment, and should label AI-generated materials within lesson plans and other school activities.
- Respect ethical standards: Emphasise the importance of ethical AI use, including active avoidance of bias, respect for intellectual property and promotion of fairness and inclusivity. Establish protocols to ensure that AI tools align with these ethical principles before being adopted. Periodic reviews of AI tools, inviting feedback from learners and staff, should identify and address any potential biases or ethical concerns.
- Train and monitor: Provide staff with the necessary training and ongoing support to use AI effectively and responsibly, in a way that complements their professional expertise. Monitor AI’s impact on teaching, learning and administrative tasks, and adapt practices based on outcomes and feedback.
Conclusion
A blanket ban on AI will not help children and young people to acquire the soft skills they need for their futures. Undoubtedly, AI is highly prone to factual errors and misinformation. Critical and digital literacy skills, if structured into the curriculum, can enable students facing information overload to sift through, analyse, cross-reference and evaluate knowledge. Integrating AI as a valuable technological tool, alongside other modes of learning, allows for an iterative and reflective approach. Learning with AI will only be truly effective if it is embedded across the curriculum, with time built in for students to learn how to use it effectively and safely, and in conjunction with the development of other key skills.
The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for Education or the Chartered College of Teaching, and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be elsewhere. You should always follow the DfE’s generative AI in education policy position and product safety expectations, in addition to aligning any AI use with the DfE’s latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk.