The importance of student voice in getting AI right for schools

Written by: Daniel Emmerson, Executive Director, Good Future Foundation, UK

School leaders are in a precarious position: they are striving to secure the best possible futures for their students whilst also contending with the impact that artificial intelligence (AI) technology is already having on how young people are adapting their study habits and their aspirations for the future (The Social Institute, 2023).

It is likely that AI will significantly impact the world of work (Forbes Technology Council, 2024), and students are already using AI tools with which school leaders are not sufficiently familiar to make informed decisions (Cheong, 2024). This article looks at several key considerations for schools about what best practice might look like, viewed through the lens of the student experience.

The activities of the Good Future Foundation are steered by two advisory groups, as well as a board of trustees. One of these advisory groups is made up of students – young people who are either in their final two years of secondary school or in their first two years of university. They are eager, engaged and committed to making a difference in addressing digital disparity. Because they are contributing for the benefit of students much younger than themselves, their perspectives focus on the development of AI in educational institutions as they have seen it evolve over the past two years.

The student council presents a variety of world views, shaped by the geographic locations of their schools and their appetite for cross-cultural programming. They come from a broad range of countries, including Pakistan, Colombia, the UK, Morocco, Argentina, India and Turkey, and bring their own unique perspectives to the initiatives that we undertake. The recommendations that they make to schools and to the foundation emerge from lived experiences across the world, and from schools with widely differing levels of access to resources and professional development.

These students believe that it is essential not only to involve AI in the academic process, but also for schools to consider the ways in which other members of the school community might be exposed to this technology. Students start from a position they have arrived at themselves: that AI is going to be front and centre of their educational journeys from school onwards, and that it will be essential to their future world of work. Consequently, unless teachers and, indeed, operational staff have a firm grounding, not just in what tools and solutions can do but also in how they should be responsibly and effectively approached, the whole school community is at a disadvantage (NVIDIA, 2024). Ultimately, they would like schools to shape the path of what best practice looks like, not just in education but from a professional perspective as well.

The recommendations below come directly from students who use AI, albeit on a regulated basis.

Using AI tools for report writing 

Just as teachers have been able to see how students work on tasks differently following the mainstream adoption of ChatGPT, students have also noted changes in how their teachers approach administrative tasks. In particular, they have noticed a trend in the content of their reports and in what these say about their progress in classes. Report writing is a standard practice across the board, regardless of where students might be located; from Casablanca to Buenos Aires, teachers report on student progress, attitude and behaviour throughout the academic year, in order to monitor development and make recommendations.

This has not always been the best approach to take, and many of the students’ schools complement report writing with teacher–parent–student conferences, where open discussion and dialogue enable a more fruitful depiction of the term or semester. However, when report writing shifts from a developmental initiative that puts the child first to a bureaucratic exercise that can be cut down by using an AI tool, students notice.

The recommendation here is not to stop using AI to address bureaucratic exercises at school; the student council believes that this should be encouraged, with the right guidance in place. Rather than drafting the report itself, an AI tool could give the teacher prompts concerning what they might need to consider for each student. The example that they provide is to ask what would be most beneficial for a student to know about their performance over a period of time, and what they might need to do better.

Using AI to generate prompts in this way means that the teacher can create a framework through which to address each student uniquely, without resorting to automation. Using AI to write student reports outright, on the other hand, points towards a somewhat less desirable future, wherein AI is solely responsible for who gets what (Harari, 2024). This is what students fear most about an AI-centric future.
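To make the distinction concrete, here is a minimal sketch of what a prompt-first tool might look like, assuming a school already uses the OpenAI Python client; the model choice and the helper function are illustrative assumptions, not tools the student council recommended.

```python
# A minimal sketch: ask a model for reflection questions that the
# teacher answers themselves, rather than asking it to draft the report.
# Assumes the `openai` package and an OPENAI_API_KEY in the environment;
# the model name and helper are illustrative, not endorsements.
from openai import OpenAI

client = OpenAI()

def reflection_prompts(subject: str, term: str) -> str:
    """Return questions that help a teacher write their own report."""
    request = (
        f"Suggest five short questions a {subject} teacher should "
        f"reflect on before writing a {term} report for one student. "
        "Focus on progress over time and on concrete next steps. "
        "Do not draft any report text."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": request}],
    )
    return response.choices[0].message.content

print(reflection_prompts("history", "autumn"))
```

The teacher, not the model, then writes each report against those questions, which is precisely the framework-without-automation approach that the students describe.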

Using AI tools for risk assessment

This was not a task that students arrived at of their own accord. Risk assessment as a process looks very different in different parts of the world, and I’m taking a very UK-centric approach to the definition here. However, the idea that AI might be used to ensure a more robust system, whilst at the same time saving teachers time and allowing more experiential learning, was something that students championed enthusiastically.

The risk assessment framework can be generated by AI, and so can the vast majority of the content that populates certain fields. Basic information can be written and populated with a few keystrokes, rather than time being spent laboriously rewriting documents for school activities. The students’ suggestion, though, was to use AI to help teachers and administrators think more creatively and actively about the risks that might be associated with an activity. They could use any existing policy or government guidance as a benchmark and then use an AI tool to test how robust the activity might be, to flag risks that might not have been considered and, most importantly, to get the assessor to engage more critically with the process of writing the assessment.

As with report writing, if risk assessments become a bureaucratic check-box exercise with no intrinsic merit, then they become a means of deflecting responsibility and limiting the scope of the in-person activities with which young people might engage. AI may be deployed to reframe this existing paradigm, first by setting up the type of rubric that encourages suitable levels of risk (Charon, 2020), and then by testing the assessor on their ability to think through practical and implementable prevention measures.
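As a companion to the earlier sketch, the following illustrates what that testing step could look like, again assuming the OpenAI Python client; the function name, the model choice and the idea of pasting in a guidance excerpt are all illustrative assumptions rather than a prescribed workflow.

```python
# A minimal sketch: use a model to challenge a draft risk assessment
# against existing guidance, rather than to write the assessment itself.
# Assumes the `openai` package and an OPENAI_API_KEY; names illustrative.
from openai import OpenAI

client = OpenAI()

def challenge_assessment(draft: str, guidance: str) -> str:
    """Ask the model to question the assessor, not to replace them."""
    request = (
        "You are reviewing a school activity risk assessment against "
        "the guidance below. List any risks the draft may have missed "
        "and, for each one, ask a question that makes the assessor "
        "think through a practical prevention measure. Do not rewrite "
        "the draft.\n\n"
        f"Guidance:\n{guidance}\n\nDraft assessment:\n{draft}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": request}],
    )
    return response.choices[0].message.content
```

The output is a set of challenges for the assessor to answer, which keeps the critical thinking with the person signing off the activity.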

Eroding the zone of proximal development 

Regulating the use of AI has now become part of the regular agenda for students when it comes to thinking through what best practice looks like. It is so fully embedded in their day-to-day lives – through the applications that they use, the processes that they put in place to complete work and the ways in which they engage with the world outside of school – that they are finding it difficult to retain the same levels of focus that they might have had prior to its adoption.

That adoption looks very different across the board, from students who prioritise paid, professional AI accounts above anything else that their monthly allowance might cover, to students who are keen to adopt and experiment with as many free or trial versions of a tool as they can find.

It’s taken two years of this level of use for students in the council to realise, independently, that it is having an impact on their cognitive performance and on how they are able to function whilst studying. Some – though not all – of the council talked through self-imposed regulation because of the noticeable impact that it is having on their zone of proximal development. One student referred to this specifically as ‘self-identified brain rot’ from not having to think through tasks and complex problems in the same way as they had done prior to the use of AI tools.

What the students are interested in is how teachers might help by guiding them through what best practice might look like. At the moment, students do not consider teachers to be sufficiently well equipped (Vodafone Foundation, 2025) to teach them adequately. For students to feel confident that their teachers are able to at least learn alongside them in this respect, they need to see teachers adopting and using, with thought and care, the tools that are most likely to have an impact.

Conclusion

There is a notion that the use of AI in the world of teaching is not desirable, or that it deviates so far from standard practice that its use should be concealed from view. What we are hearing most at the moment from students is that it would be beneficial for best-practice AI technology use in schools to be brought to the forefront, and to be analysed and demonstrated with students as part of a collaborative process.

Whether through report writing or assessing risk, there are plenty of cases where teachers might demonstrate to their class how they are using the technology to the advantage of the class, and indeed of the school.

This is a phenomenally exciting time to be a teacher, but so much hinges on getting these next steps right. I am advocating for the voices of students to be heard and considered in the process of this important decision-making.

The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for Education or the Chartered College of Teaching, and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be appropriate elsewhere. You should always follow the DfE’s Generative AI In Education policy position and product safety expectations, in addition to aligning any AI use with the DfE’s latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk.
