James Rogers, Research and CPD Lead, Teaching School Council SW, UK
Teaching should be a research- and evidence-informed profession, rather than a research- or evidence-led profession. The latter implies a passive engagement while the former implies an active engagement, which, I argue, comes from critical engagement, asking the right questions of research and understanding what it can and cannot offer.
There is much on offer to the profession in terms of research and ‘expertise’, but the system is ‘noisy’ and there are a wide range of organisations and individuals making bold claims about strategies and ‘interventions’ that ‘work’, some of which present conflicting messages. It concerns me that while we operate in this research-rich landscape, in my experience of regional school improvement some schools can focus on initiatives and approaches that are not appropriate for their needs or context. Often, schools are looking for a ‘quick fix’, to solve a problem flagged in their performance data. While schools have become very well-equipped to use data to measure performance, there has, in my view, been an overemphasis on measuring performance rather than exploring exactly what the underlying issues are and how they might be best resolved through a sustainable change in professional practice. This is partly down to a lack of critical engagement and challenge from the profession.
Defining (academic) ‘research’
It is important to understand what research can and cannot bring to the profession, and this begins with understanding what research is. There is a discipline and rigour involved in academic research, because research is the process of creating new knowledge and insights, in order that we can better understand and potentially improve or control something. It can only be achieved when a specific, rigorous (and ethical) approach is applied. Findings tend to be published through academic journals, with peer review being an important part of the process, in which experts in the same field check the quality of the research and its suitability for publication (just as happened to this article). Research design is an integral part of the process of research. It is (Gorard, 2013, p. 20):
“a way of organising a research project or programme from its inception in order to maximise the likelihood of generating evidence that provides a convincing answer to the research questions for a given level of resource.”
In defining research, we also ought to acknowledge that there are different research approaches, determined by the nature of the research question and by the theoretical perspective of the researcher. Understanding different research approaches can be daunting, but there is extensive literature presented in accessible formats to guide you. For example, in the Chartered College of Teaching’s ‘Compact Guides’, Jones and Netolicky (2019) present a hierarchy of research, ranking different approaches from randomised controlled trials to cross-sectional surveys. This is a useful starting point, although ranking is perhaps not the best way to look at research; rather, one should understand the opportunities and limitations of each approach. For example, a case study can allow a researcher to drill down into a particular research area, and while findings may be very context-specific, they allow for a much deeper, nuanced understanding of a situation; however, inferring causality and generalising on a larger scale is not possible. Conversely, meta-analyses (see, for example, John Hattie’s Visible Learning (2008)) can provide robust and reliable findings at scale, but it is challenging to identify contextual variation (Breckon, 2016). Each approach is suited to a particular kind of research question or focus.
Defining ‘craft knowledge’
‘Craft knowledge’ (Wilson, 2013), the knowledge that we develop through the experience of our roles – knowledge of our practice, our children, our school community and the local community – is also important. Wilson describes this knowledge as (Wilson, 2013, p. 2):
“hard to verbalize because it is expressed through action-based skills, is difficult to make explicit or to represent in textual form because it is largely acquired informally through participation in teaching situations, and it is often so “taken for granted” that teachers are often unaware of its influence on their behaviour.”
This craft knowledge is important in the day-to-day operational activity within the school. However, as Weinstein and Sumeracki (2018) suggest, teaching is often more informed by intuition than by research, and this intuition can lead us towards practices that have little or even a negative impact on teaching efficacy or learning outcomes. An additional problem with relying on intuition is that we can positively reinforce it through ‘our tendency to seek out information that supports rather than disproves our beliefs’ (Weinstein and Sumeracki, 2018, p. 12). In my opinion, an overreliance on intuition has allowed approaches such as ‘learning styles’ – many of which ‘appear logical’ – to inform practice with no evidence of their impact on learning. If decisions are based only on intuition, we run the risk of not challenging our thinking and perspectives, and of promoting poor practice.
How to critically engage
Therefore, I argue that there is a crucial role for research to inform and even challenge existing practice. A crucial space exists between craft knowledge and research, and it is best utilised when the profession critically engages with both its own context and that of the research. I would like to illustrate the process with two examples.
First, a good example of critical engagement can be seen in Coe’s Impact article about the extent to which retrieval practice can be translated into classroom practice (Coe, 2020). Coe carefully explores the context and limitations of the research that underpins retrieval practice, and while not challenging the approach, he does highlight that translating it to the classroom is more nuanced than perhaps we might assume.
Second, I would like to describe how we asked questions of evidence in order to seek alternative solutions to school improvement. The context is city-wide school performance data that indicated in a number of measures that schools were underperforming and had been for some years, despite resources being invested in targeted CPD. This related to phonics, reading, writing and mathematics, and appeared to be a problem from Early Years all the way through to Key Stage 4. This was the ‘headline data’ and was normally the driver for some form of CPD-related response. In leading a Teaching School responsible for brokering and developing school improvement, my response was to ask questions of the data, explore its limitations and consider alternative evidence. I looked beyond the headline data, listened to school leaders and considered the potential influence of deprivation, low literacy among parents, poor parental engagement with schools, and CPD models. Our ‘hunch’ was that resources and ineffective CPD were targeted in the wrong areas, and one area that had not been considered was language development and ‘oracy’. This then led to an exploration of several effective oracy projects elsewhere in the country, exploring the research behind them, the context and impact, and a consideration of how well the approaches might suit our specific context and need. The result was a city-wide oracy project, tailored for our specific communities and with sustainability and collaboration at the core. The impact has been fantastic to witness.
What follows are some sources of useful guidance to support this process and the implementation of change.
Informed Health Choices (nd), an international multidisciplinary group, has published a framework for ‘thinking critically about claims, evidence and choices’ called ‘That’s a claim’. The framework enables the careful questioning of research and evidence in a systematic way, offering three stages to the questioning. They provide a useful poster for educationalists to use when considering research and evidence (also see Müller et al., 2020).
The EEF’s ‘Putting evidence to work – a school’s guide to implementation’, updated in 2019, details the exploratory phase and offers guidance on ‘gathering and interpreting data to identify priorities’.
The May 2017 edition of Impact (the very first edition!) focused on evidence-informed practice. Mannion’s (2017) article explores the importance of professional judgement, and Bishop (2017) explores the importance of asking questions.
NESTA’s Alliance for Useful Evidence seeks to inform those working with social policy and practice, including teachers, on the ‘smarter use of evidence’. In their excellent ‘Using research evidence: A practice guide’ (Breckon, 2016), different research approaches are presented and summarised, alongside pros and cons (see pp. 21–23).
In this article, I have made the case for the teaching profession to be research- and evidence-informed. For this to be effective and for research to meaningfully impact on practice, the process of critical engagement, I have argued, is crucial. As a profession, we need to remain open-minded and prepared to continually learn and reflect, as research evolves and improves our understanding of what works and what does not. A healthy pinch of scepticism should be used, and an acceptance that ‘best practice’ is ‘best, not in an absolute sense, but in a comparative sense. It is the “best so far”’ (Syed, 2019, p. 230).
Bishop C (2017) What doesn’t work in education? John Hattie and the importance of asking questions. Impact (Interim Issue): 42–43.
Breckon J (2016) Using research evidence: A practice guide. NESTA. Available at: https://media.nesta.org.uk/documents/Using_Research_Evidence_for_Success_-_A_Practice_Guide.pdf (accessed 22 June 2020).
Coe R (2020) Does research into retrieval practice translate into classroom practice? Impact 8: 12–13.
Education Endowment Foundation (EEF) (2019) Putting evidence to work – a school’s guide to implementation. Available at: https://educationendowmentfoundation.org.uk/tools/guidance-reports/a-schools-guide-to-implementation (accessed 1 May 2020).
Gorard S (2013) Research Design. London: SAGE Publications.
Hattie J (2008) Visible Learning. Abingdon: Routledge.
Informed Health Choices (nd) That’s a claim. Available at: www.thatsaclaim.org/educational (accessed 1 May 2020).
Jones G and Netolicky D (2019) Research-informed practice. Chartered College of Teaching Compact Guides. Available at: https://my.chartered.college/resources/compact-guides/research-informed-practice-by-gary-jones-and-deborah-netolicky (accessed 1 May 2020).
Mannion J (2017) Evidence-informed practice: The importance of professional judgement. Impact (Interim Issue): 38–41.
Müller LM, Morris A, Sharples J et al. (2020) Assessing claims in education – the ACE concepts. Impact 8: 60–63.
Syed M (2019) Rebel Ideas: The Power of Diverse Thinking. London: John Murray.
Weinstein Y and Sumeracki M with Caviglioli O (2018) Understanding How We Learn: A Visual Guide. Abingdon: Routledge.
Wilson E (ed) (2013) School-Based Research: A Guide for Education Students, 2nd ed. London: SAGE Publications.