CHRIS LOVEDAY, VICE PRINCIPAL, BARTON PEVERIL SIXTH FORM COLLEGE, UK
Introduction
In January 2024, Barton Peveril Sixth Form College (BPSFC) began exploring the integration of artificial intelligence (AI). Initially, the focus was on leveraging AI to support business functions and improve operational efficiency, before being expanded to include student engagement and marking. This case study examines the challenges and opportunities encountered by BPSFC as it navigated the complexities of AI implementation. The college’s experience provides insights into the ethical, practical and pedagogical considerations relevant to other educational institutions considering adopting AI technologies.
Implementation
BPSFC implemented a phased approach to AI integration. Initial efforts focused on business functions, before addressing teaching and learning solutions. The college made Barton AI (their own front end to a large language model or LLM) available to staff and has begun to pilot access for students. The college also established an AI working group. The college recognised the importance of red-teaming AI agents (which will be explained in more detail below), and began to trial Barton AI with first-year EPQ (Extended Project Qualification) students and the students from selected tutor groups.
Background and initial exploration
BPSFC recognised the potential of AI to enhance efficiency, automate routine tasks and solve complex problems early. The college’s senior leadership team (SLT) met in January 2024 to discuss the implications of AI across the college, including business operations, cybersecurity, teaching and learning. An area of particular interest was its application to tasks that were repetitive, high-error tasks and those that often had declining rates of productivity. BPSFC also felt that addressing these areas would enhance their ability to staff their support services – roles that were vulnerable to retention issues as a consequence of the private sector offering higher rates of pay for roles at the lower end of their pay scales. The college decided to first explore AI solutions for business functions; the rationale behind this decision was that tasks across the business services teams were repetitive tasks (which often had decreasing productivity rates, with increasing error rates) and as such presented the ‘quicker wins’. The goals included reducing staff workload, improving staff and student experience, and ensuring value for money.
BPSFC began to explore the market for external consultants to support them in the delivery of the project and achieving their aims. They engaged with 13 specialist companies to conduct ‘deep dives’ into their business functions, before finally settling on two external companies; both were appointed to explore AI-based solutions. The first focused on creating bespoke ‘AI agents’ by adapting existing AI tools. The second examined ‘Google-based’ solutions, including moving systems to Google Cloud and piloting Google Gemini across the entire college. Both companies carried out independent deep dives of the business services. This case study primarily focuses on the first deep dive and the subsequent creation of bespoke agents. The deep dive revealed 107 functions where AI could make a strong impact; these 107 functions could be delivered through the creation of 26 bespoke agents.
Development of AI agents
Through its partnership with the external consultant, BPSFC began developing a range of AI agents, including:
- Barton AI: This is a bespoke front end to an LLM, built within the college’s Google tenancy, which allows the college to implement its own safeguards and prompts to prevent misuse and bias. This also eliminates the need for costly monthly licences, while ensuring GDPR compliance. For example, staff can put class sets of data into the LLM for analysis, while ensuring that the sensitive data never leaves the college’s digital domain, and that it isn’t being used to train any external models.
- Barton Buddy: This is a digital assistant for students, providing information and guidance from a range of sources. These include Google Classroom, Sites, Docs, Slides, Gmail, Workspace and more, as well as API (application programming interface) to the management information system (MIS) and the local bus company’s live bus data. This agent performs as a ‘one stop shop’ for students, offering 24/7 guidance on college life, including low-level health and wellbeing support, signposting students to immediate internal and external support.
- Peveril Assistant: This is a digital assistant for staff, just like Barton Buddy for students, providing information and guidance from a range of sources. These include policies, procedures, the college’s infrastructure and more. Peveril Assistant also includes the API access to the MIS.
BPSFC has developed 19 agents in total, to improve efficiency, reduce human error and address repetitive tasks. These include agents to enhance the enrolment process (like a GCSE results agent, which pre-populates the MIS before being reviewed by a human), a marking tool that provides both student feedback and a mark, and a certificate agent that cross-checks details on the exam board certificate with those in the MIS.
By May 2024, eight AI agents were ready for demonstration to staff. All agents built for BPSFC are fully tested in a ‘sandbox’ site (a testing environment that is isolated from a live system, which allows users/developers to experiment, test new features or practise without affecting the actual operational environment) by staff and students selected to be part of the ‘red team’. The aim of red-teaming agents is to find flaws and vulnerabilities in an AI system. This is done in a controlled environment (the sandbox site) in collaboration with the AI’s developers. Each agent is ‘red-teamed’ for a period of weeks, until such time as the college and developers are happy with the performance of each agent.
Challenges and risks
Like all who embrace this new technology, BPSFC faced several challenges and risks within its AI implementation, which the college recognised and addressed proactively:
- Ethical considerations:
-
-
- Bias and fairness: The college acknowledged the risk of AI systems perpetuating biases if trained on data reflecting social inequalities. This concern was addressed by building all agents within its own Google tenancy so that it could implement its own constraints and prompts to try to avoid misuse and bias. One such control implemented was the ‘100% accuracy’ prompt built into the front, meaning that if the agent is not certain of the output being correct, then it does not offer a response. Instead, it is programmed to reply ‘I do not have the accuracy to help with that question’.
-
- Digital equity: The college noted that not all students had equal access to AI tools, potentially widening the digital divide. This concern was addressed by developing both Barton Buddy and Barton AI as resources to be made available to all students free of charge. As both agents run on API access to an LLM, the necessity for individual licences is removed, replaced instead by much smaller ‘token’ costs, charged for what has been used.
- Consent: Most of the API access to LLMs requires parental consent for users under the age of 18. This presents a challenge in terms of addressing parental concerns around the use of AI and then recording consent in a way that grants access to the relevant agents. The often negative media coverage about the misuse of AI and the potential risks around data security cause parental concerns, which can be compounded by a broader lack of understanding of what generative AI (GenAI) is and how it works. These needed to be addressed through clear communications with explanations. BPSFC set up an ai@ email address so parents can ask questions that they would like clarified.
- AI literacy: Many students associate all AI applications as being LLMs. It was only through launching Barton Buddy to students that it became clear that many young people did not understand what GenAI is. Most users thought that Barton Buddy was itself an LLM, with access to the internet (despite it being made clear that it couldn’t do that). Applications like Snapchat AI and, more recently, Meta AI, both of which are LLMs, helped to compound the confusion.
- Transparency and accountability: The college recognised the potential for a lack of transparency due to ‘black box’ AI algorithms and the importance of establishing clear lines of responsibility. (Black box AI algorithms are AI systems where the internal workings are hidden or not easily understood, even by the developers who created them. They make decisions using complex processes that are difficult to interpret or explain.) Through the development of their own agents and the implementation of additional safeguards and controls, BPSFC was able to mitigate this risk. In addition, unlike commercial LLMs and tools, all of the BPSFC tools record the input (question asked) and, more importantly, the output (answer given), providing clear transparency.
- Data privacy and security:
-
-
- The college noted that AI systems require access to personal data, and thus the need to ensure the security of that data. BPSFC ensured that AI tools used with college data adhere to GDPR standards, including conducting DPIAs where appropriate and training staff on data safety.
- Barton AI was designed not to produce images or videos, to further restrict potential misuse.
-
- Practical challenges:
-
-
- The college was aware of the costs associated with implementing AI solutions, including licensing, training and infrastructure.
- The college also recognised that some areas, like timetabling, were more complex to automate and required bespoke AI solutions.
-
Opportunities and benefits
Despite the challenges, BPSFC also recognised several opportunities and benefits of using AI in education:
- Enhanced efficiency and reduced workload: AI agents like the GCSE Results Agent and Certificate AI have significantly reduced the time spent on manual tasks. For example, the GCSE Results Agent saved 93 labour hours during enrolment, and Certificate AI saved 300 labour hours annually.
- Improved student experience: AI tools like Barton Buddy provide students with easy access to information, including timetables, attendance and grades. The college has also used AI tools such as their marking tool (or the commercial tool Brisk) to provide tailored explanations, feedback and resources, helping students to grasp complex concepts and improve their understanding.
- Enhanced teaching practice: Staff have access to tools like Brisk and the BPSFC marking tool to enhance teaching and learning. AI tools have also supported the development of revision skills and retrieval practice. The college is using AI tools to create resources (through Google Gemini and Google Gems) and to provide student feedback.
- Digital literacy: BPSFC has developed resources so that students are being educated on what GenAI is, how to use and evaluate AI technologies, and how to understand their ethical implications and limitations. The college is working with external stakeholders to develop ‘AI driving licences’ for students: a provisional licence that covers the very basics and a full licence that is far more comprehensive in its content.
- Future readiness: Students are developing digital literacy and skills essential for the future workplace, where AI will play an increasingly important role. BPSFC has become the first UK ‘Gemini AI Academy’, providing staff with regular access to training and with all staff having a premium Google Gemini account.
- Accessibility and inclusionAn approach where a school aims to ensure that all children are educated together, with support for those who require it to access the full curriculum and contribute to and participate in all aspects of school life: By providing an LLM, the college aims to ensure that all students can benefit from these technologies and reduce the digital divide. In addition, the agents are developed to be multilingual, with a voice-to-text function and both light and dark mode, to support the accessibility of the tools to a wide audience.
BPSFC is currently developing an ‘AI driving licence’ to inform students of what GenAI is, the ethical considerations, its history and its evolution. The college held ‘AI inset days’ in June and November, providing training and development to staff on how to use GenAI, both in and out of the classroom. In addition, there have been regular optional training sessions for staff to attend as part of their continuous professional development (CPD), exploring the use of tools like Gemini, Gems and Notebook LM – these sessions have regularly been oversubscribed.
Conclusion
BPSFC’s journey with AI provides a valuable case study for other educational institutions exploring AI integration. The college’s approach has been proactive and considered, balancing the opportunities and benefits with ethical and practical challenges. By embracing a responsible and iterative approach to AI, Barton Peveril is positioning itself to provide a more personalised, efficient and future-ready educational experience for its students, and is contributing to the understanding of how AI may be integrated into the educational landscape.
Next steps
Following the successful development and integration of their agents, BPSFC is now focusing on further developing their marking tool to reduce teacher workload and improve marking standardisation. They have also commissioned the development of a new agent to reduce the administration burden around SEND, which will be red teamed in the spring term and go live in September 2025.
The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for EducationThe ministerial department responsible for children’s services and education in England or the Chartered College of Teaching and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be elsewhere. You should always follow the DfE’s Generative AI In Education policy position and product safety expectations in addition to aligning any AI use with the DfE’s latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk .