Artificial humanities: Can we maintain the ‘human’ in the humanities with AI?

Written by: Jack Wardle
JACK WARDLE, TWINKL EDUCATIONAL PUBLISHING, UK

Defining the humanities is a difficult task, as the disciplines it encompasses are vast and varied – from the near-universal inclusion of history and geography to the potential cross-disciplinary inclusion of languages, the arts and the many subjects in between. Drees (2021) tentatively defines the humanities as disciplines exploring human self-understanding and the expressions through which people construct their world.

Both this definition and its author highlight the complexity, subjectivity and interrelated dimensions of studying the humanities. Such study requires critical thinking, reflectiveness and an objective view of diverse perspectives. Concurrently, the field of digital humanities (the application of digital tools and methods to traditional humanities disciplines) is growing rapidly, with artificial intelligence (AI) playing a crucial role in its evolution. This article considers:

  • How can AI tools enhance the design of humanities curricula, while maintaining the discipline’s emphasis on exploring human experiences, perspectives and cultural expressions?
  • In what ways might use of AI systems influence how humanities educators shape students’ understanding of the world and their place within it?

 

What are the gaps in research?

This perspective is offered at a time when mainstream research, such as the recent Education Endowment Foundation (EEF) study on the use of AI by classroom teachers and leaders (EEF, 2024), is reaching schools as they consider the benefits of using AI. In that study, the EEF calls for further research on AI’s varied impact across subjects, given that its sample specifically comprised Key Stage 3 science teachers.

From archives to access

When designing a primary or secondary history curriculum, teachers face a two-fold problem: first, accessing the expansive and diverse source material from which to choose, and second, making the source content itself accessible to students. For example, when studying Britain’s settlement by the Anglo-Saxons, students may struggle to decipher, even in translation, the archaic expressions and intricate sentence structures of the Anglo-Saxon Chronicle; when studying crime and punishment, they may struggle with the cursive handwriting, archaic terminology, abbreviations and codes of Victorian-era census records. The precursor to accessibility in the classroom is source material being both available and accessible to historians and archivists in the first place.

Historians and archivists are seeing changes in, and a scaling up of, both these barriers. Spina (2023) highlights the role of AI in enhancing and accelerating digitisation, making vast volumes of analogue material more accessible to historians. While digitisation has long been a process in historical archiving, AI contributes by automating and refining key steps, such as transcription (converting text into a digital format), error correction (fixing mistakes introduced during transcription) and data normalisation (standardising the data for consistency).

Questions for curriculum design

  • What larger collections of sources, if available through AI-led digitisation, could enhance your humanities curricula?
  • How do we maintain student criticality when generative AI (GenAI) may move far beyond digitisation into critiquing, summarising, identifying patterns and attempting inference?
  • Could complex and sophisticated yet crucial and fascinating historical sources be transcribed and summarised using an AI system, personalised to the needs of younger and even individual learners?

 

Virtual journeys: Bringing the world to your classroom

Bringing the world into the humanities classroom is beginning to be made possible through the integration of AI with virtual reality (VR) and augmented reality (AR). Alazmi and Alemtairy’s (2024) research demonstrates that VR field trips themselves can boost academic achievement by encouraging exploration of historical events and social contexts. In this study, VR-assisted learning reduced cognitive load by presenting complex concepts through more easily digestible, multisensory formats. Students reported a heightened sense of presence – physical, social and self – leading to increased engagement and motivation with the content.

While this highlights the potential impact of VR on academic achievement – something that could be applied when designing humanities curricula – it does not specifically address the role of integrating AI into these experiences. Other studies indicate that combining AI with AR and VR can further enhance educational outcomes. For example, Lampropoulos (2023) discusses how integrating AI into AR and VR experiences in other fields of education can provide personalised learning experiences, real-time feedback and adjustment. This leaves open the potential for highly accessible, inclusive, personalised and engaging learning experiences to be planned within humanities curricula.

Questions for curriculum design

  • What elements of your humanities curricula would benefit from VR/AR?
  • What support do educators need to effectively use these technologies?
  • What elements of humanities curricula are most difficult to adapt for learners with specific needs? How could VR, AR and AI support here?

 

Bias in the machine: Who’s telling the story?

While algorithmic bias is broader than AI, the rapid developments in this technology make it a crucial topic of which those designing humanities curricula should be aware. An AI system will respond based on the data on which it has been trained or to which it has access. Where this data originates, and how expansive it is, raises questions about which communities, cultures and perspectives will form part of its response. In exploring the term ‘AI colonialism’, Mollema (2024) points to the challenge for developers of ‘prohibiting AI deployment to continue along its colonial trajectory’.

Taking this into the classroom, Nyaaba et al.’s (2024) study investigates how GenAI could impose Western ideologies on non-Western societies, perpetuating digital neocolonialism in education through inherent biases. They note that GenAI systems, trained mainly on Western data, may generate content that reflects Western cultural references and examples, potentially alienating students from non-Western backgrounds. Such systems might also marginalise non-dominant languages, since GenAI is trained predominantly on Western languages. One suggestion of this study is to increase students’ awareness of identifying colonial bias in GenAI, tackling this risk head-on by using the tools in the classroom as a prompt for discussion of bias itself.

Questions for curriculum design

  • In what ways can educators be trained to recognise and mitigate algorithmic biases when creating AI-generated curriculum content?
  • How explicit does a humanities curriculum need to be in involving students in discussions about AI ethics and biases, specific to the disciplines being studied?
  • What is your process for evaluating how marginalised voices are highlighted in your humanities curricula?

 

Breaking stereotypes: Is humanity represented through AI?

High-quality humanities curricula depend on accurate and diverse representations in order to make human societies and environments understandable; however, as educators begin using AI to create educational content, AI’s tendency towards generalisation in what it generates poses a risk.

A study by Ye (2023) analysed how a GenAI platform would render urban street view images from different countries – an educational resource that could be useful in both primary and secondary humanities. The results highlighted how homogenous the AI-generated images were: they leaned on continental stereotypes and represented economic development and modernisation as highly uniform. This highlights the risk of reinforcing existing biases and misconceptions through AI’s overly simplistic generalisations when developing educational content.

Generalised portrayals, if used in humanities teaching and classroom discussions of geography, history or culture, have the potential to mislead students, reinforce stereotypes and limit the global diversity of humanities curricula. Ye (2023) points to the potential benefits of GenAI in geography education specifically, while emphasising the importance of users staying mindful of possible biases and stereotypes.

Questions for curriculum design

  • How do we ensure that the diverse and often marginalised communities and cultures within regions are reflected? Does your curriculum avoid stereotypes?
  • How do you robustly check and evaluate this?
  • How can we explicitly address the tendencies of GenAI and its potential implications for studying the humanities?

 

Who holds the reins? Responsibility in curriculum design

Many of the studies cited here offer a call to action for educators and users of AI tools to interrogate their outputs. As the use of AI grows exponentially, there are attempts to bring an ethical framework to both the development and the use of these tools. However, frameworks such as UNESCO’s ‘Recommendation on the ethics of artificial intelligence’ (2021) and the European Commission’s ‘Ethics guidelines for trustworthy AI’ (2019) are advisory rather than statutory, and are not sector-specific.

For the time being, those designing humanities curricula must take this responsibility themselves. By proactively embedding AI into humanities curricula, we can ensure that the ‘human’ remains central to these subjects.

Reflecting on your curriculum design:

  • To what extent are students equipped with an understanding of AI applications in your subject’s discipline, even if its current application is experimental?
  • How does your humanities curriculum cultivate students’ critical thinking skills regarding AI-generated content in your subject?

 

By considering these questions, educators can pursue a curriculum design that not only embraces technological advancements but also upholds the core values and purpose of the humanities, remaining mindful of the potential challenges that may lie ahead.

The examples of AI use and specific tools in this article are for context only. They do not imply endorsement or recommendation of any particular tool or approach by the Department for Education or the Chartered College of Teaching, and any views stated are those of the individual. Any use of AI also needs to be carefully planned, and what is appropriate in one setting may not be elsewhere. You should always follow the DfE’s Generative AI in Education policy position and product safety expectations, in addition to aligning any AI use with the DfE’s latest Keeping Children Safe in Education guidance. You can also find teacher and leader toolkits on gov.uk.

