GEMMA MOSS, PROFESSOR OF LITERACY AND DIRECTOR OF ESRC EDUCATION RESEARCH PROGRAMME, UCL INSTITUTE OF EDUCATION, UK
RACHEL FRANCE, RESEARCH FELLOW, ESRC EDUCATION RESEARCH PROGRAMME, UCL INSTITUTE OF EDUCATION, UK

Research-informed practice in England

The concepts of evidence-based and research-informed practice have become increasingly central to school improvement in England. With its origins in evidence-based medicine (EBM), evidence-based practice (EBP) aims to identify ‘what works’ and use that information to guide professional practice. Randomised controlled trials (RCTs) and the aggregation of outcomes from well-conducted studies are the preferred sources of evidence. In line with other What Works Centres in the UK (What Works Network, 2018), the Education Endowment Foundation (EEF) was founded on this premise.

The EEF’s activities focus on: summarising the best available evidence, through systematic reviews and synthesis; generating new evidence, through running trials of promising programmes and interventions, mainly using RCT designs; and promoting evidence use, through the Teaching and Learning Toolkit (EEF, n.d.) and topic-focused guidance reports (Edovald and Nevill, 2021). In many respects, this makes the EEF the dominant player in developing research-informed practice in England, through the choice of interventions that they trial, the topics that they write up and the summary formats that they use to inform decision-making in the field. All of this activity puts them firmly on the supply side of the ‘what works’ agenda: providing the evidence intended for others to use.

Challenges facing evidence-based practice

But the supply of evidence is only one half of the EBP story. What happens to the research evidence in practice raises different questions. Reviews of the impact of the ‘what works’ agenda across different areas of public policy and service delivery find mixed results. Policymakers’ assumption that clarity on the evidence would lead to tangible system improvements has not always been realised (Blunkett, 2000; Gibb, 2017). This remains a conundrum for those committed to the principles of EBP (Slavin, 2020; Kraft, 2020). It has also strengthened interest in researching the interactions between evidence and professional knowledge in context (Davies and Harrison, 2003; Powell et al., 2018; Nutley et al., 2019). This line of enquiry remains active, including in education (Flynn, 2019; Rickinson et al., 2022).

In a period of decreased funding for educational research, there has been a significant increase in the number of RCTs conducted to clarify ‘what works’, thanks in large part to EEF funding (Connolly et al., 2018; Edovald and Nevill, 2021; REF Panel C, 2022). In the eyes of impartial observers, these trials have met the standards for high-quality RCTs set out by Connolly et al. (2018):

Later RCT studies were more likely to use trial registration and pre-published protocols, to analyse subgroups to look for differential effects, and to employ theory of change approaches to process evaluation to better understand why an intervention might, or might not, have an effect.

(REF Panel C, 2022, p. 165)

Yet a meta-analysis of outcomes from 82 RCTs commissioned by the EEF and 59 RCTs commissioned by the US-based funder the National Center for Education Evaluation and Regional Assistance found that the reported effect sizes were much smaller than anticipated (Lortie-Forgues and Inglis, 2019). Indeed, many of the trial findings were ‘consistent with both the null hypothesis of no effect and also an effect comparable to that associated with one year of maturation and instruction’ (Lortie-Forgues and Inglis, 2019, p. 164). In other words, the interventions did not outperform business as usual.
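
For readers less familiar with the metric, effect sizes of this kind are usually reported as standardised mean differences. A minimal sketch, assuming the familiar Cohen’s d formulation (an illustration on our part, not a formula taken from the studies cited):

$$
d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
$$

where $n_1, n_2$ and $s_1, s_2$ are the two groups’ sizes and standard deviations. On Cohen’s conventional benchmarks, d ≈ 0.2 counts as a small effect and d ≈ 0.8 as a large one, which gives a sense of how modest the trial results above were.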

The modest effect sizes associated with running interventions at scale raise questions about the contribution to knowledge-building in education that RCTs can realistically make (Kraft, 2020; Sims et al., 2022). There are competing explanations for the lack of expected impact: that the interventions chosen are not of sufficient quality to be successfully scaled up (Lortie-Forgues and Inglis, 2019; Sims et al., 2022); that there is insufficient caution in both the design and interpretation of outcome measures (Kraft, 2020); and that the evidence base does not sufficiently distinguish between programmes designed to be implemented with fidelity and generic advice that specifies implementation less clearly (Slavin, 2020). Others propose that the problems lie not with the quality of the evidence but with practitioners’ understanding, motivation or behaviours, and the extent to which they are willing to change what they do in light of the evidence they have been given (Gorard et al., 2020; Waddell and Sharples, 2020). In a high-stakes accountability system based on compliance with external prescription, this may seem the most obvious answer. Yet placing so much weight on what the evidence says, and doubling down on the search for ‘levers and mechanisms that can influence change in practice’ (Waddell and Sharples, 2020, p. 5), may overlook the need for ‘intelligent adaptation of evidence to meet local context and circumstances’ (Collins and Coleman, 2021, p. 25).

The evidence on evidence use

Cross-sector studies consistently find that:

Research-based evidence alone is unlikely to be sufficiently influential to determine the direction of a policy or practice, nor should it do so. There is a need to involve a wide range of actors and ways of knowing if relevant knowledge is to be created and used in the pursuance of better policy.

(Nutley et al., 2019, p. 313)

The same applies to practice. In this light, sectors including health have been actively exploring: how professional knowledge can be brought into dialogue with evidence from research; why insights into the realities of patients’, users’ and practitioners’ everyday lives matter; and the different ways in which user perspectives can be incorporated into research designs. This changes the dynamic in the relationship between research and practice, putting the two on a more equal footing (Flynn, 2017; Farley-Ripple et al., 2018).

How evidence is used matters at least as much as what the evidence says, and what the evidence says also depends upon the questions posed. Dialogue between the research community and stakeholder groups can happen at any point in the research cycle, not just once findings are established. This does not mean losing sight of the weight of the evidence. It does mean ensuring that research is relevant to the intended users and takes account of their knowledge and insights (Rickinson et al., 2022). Yet education policy in England seems little inclined to foster such a space for open discussion on the value and usefulness of the research base that has come to define ‘research-informed’ practice. Government-sponsored attempts at knowledge mobilisation presume that the evidence will fix attainment gaps. This leads to a narrow focus on the instrumental purposes that research could serve, reinforced by England’s high-stakes accountability system. Key questions associated with other outcomes go unanswered (Harmey and Moss, 2021).

Rethinking how research and practice interact

By contrast, the Economic and Social Research Council (ESRC) Education Research Programme (www.ucl.ac.uk/education-research-programme) has set out to identify and evaluate other ways of bringing research and professional practice into dialogue. Nine research projects are currently investigating two policy-relevant themes: the uses of technology in teaching and learning, and teacher recruitment, retention and development. They are doing so by giving research partners (teachers, pupils and other stakeholders) a bigger role in the direction that the research takes, using different strategies for stakeholder engagement.

To give some examples, two projects are using qualitative approaches to explore teacher perspectives on how digital technologies are used in the classroom. One is using a case study design to explore the motivations and interests of secondary school teachers that shape their own technology use in the classroom; the other is exploring new ways of using digital technology to enhance educational and social equity, starting from observation of current classroom use. Both projects hope that the research will generate new insights that can prove as useful to those involved in edtech design as to practitioners.

Another project is creating opportunities for Early Years teachers in Wales to reflect on how to make the Welsh curriculum’s commitment to children’s rights a reality in classroom practice. The project uses a variety of approaches to knowledge-building, with children as well as beginning and more experienced teachers involved in project activities. This puts teacher agency and principles of co-production at the heart of its model of teacher development.

In the case of teacher recruitment, retention and development, another project is using innovative methods to record and share teachers’ and school leaders’ perspectives on what matters most in teacher retention and resilience. By placing teachers’ voices at the heart of the enquiry and asking participants to reflect on their classrooms, their school and the wider policy context, the team will use preliminary findings to co-design and test a range of strategies for building community resilience that can be rolled out to other schools and adapted to their settings. Participatory working will be embedded in each phase of the project, including data collection, resource development and dissemination.

In these ways, the programme is not only exploring specific topics but also building knowledge about different ways of involving partners in research. By providing real-world experience and insights from across the four UK nations, it should shed light on the conditions under which different forms of partnership can be successful, and on the infrastructure or funding that might be required from policymakers to enable such partnerships to thrive. By raising new questions and looking for new answers, the projects will be addressing a significant knowledge gap.

Because the research programme tests out new ways of building practitioner insights and knowledge into the research process, and because its projects are taking place in different education policy contexts, it provides an opportunity to rethink how the relationships between research, policy and practice are currently framed and how else they might work together. Certainly, the programme will be encouraging more explicit reflection on the direction that research in education should take and whose voices should be included in making those decisions. In line with the programme’s commitment to stakeholder engagement, we will be running a series of activities to discuss emerging findings relevant to education practitioners, researchers and policymakers. To get involved, visit our webpages at: https://www.ucl.ac.uk/education-research-programme/events-0
