
Evidence-informed professional development: Blueprints or jigsaws?

Thomas Perry and Rebecca Morris, University of Warwick, UK

Implementation: Easier said than done

Underlying much current thinking about evidence-based practice is the idea that teachers select and implement the programmes and strategies that researchers deem effective. In this division of labour, researchers find out ‘what works’ and teachers are tasked with figuring out the practical details to make programmes and practices work with sufficient ‘fidelity’ (i.e. adherence) to a research-backed blueprint.

Vital to this conception of evidence-based practice is the notion of ‘implementation’. The Education Endowment Foundation’s (EEF) implementation guide goes as far as saying that ‘implementation is what schools do to improve’ (Sharples et al., 2018, p. 3). It describes implementation as a form of project management supported by and incorporated within a wider scheme of school self-evaluation, leadership and improvement. The process of implementation requires schools to: i) identify evidence-based programmes or practices that will meet their needs; ii) develop a ‘clear, logical and well-specified’ implementation plan; iii) deliver the implementation plan, paying particular attention to professional development and using data to ‘drive faithful adoption and intelligent adaption’; and iv) plan for sustaining and scaling the approach, with continued support and encouragement for ongoing implementation (Sharples et al., 2018, p. 5).

What is clear from the EEF implementation guide is the sheer amount that schools need to know and do prior to, during and following the delivery of an evidence-based programme or practice in order for it to be a success. For example, consider what is required to produce a detailed implementation plan that sets out the ‘why’, ‘what’, ‘how’, ‘how well’ and the expected outcomes of a ‘targeted, yet multi-stranded, package of implementation strategies’ (Sharples et al., 2018, pp. 20–21). In our experience, the information provided along with evidence-based programmes and practices often falls far short of what is needed to understand both the intervention and how it should be implemented. It doesn’t help that only one in five education trials includes a ‘significant or notable’ implementation and process evaluation (IPE), and for nearly two-thirds (62.4 per cent) of trials the IPE is non-existent (Connolly et al., 2018, p. 287). When done well, these IPEs can provide important, in-depth information about how an intervention works in practice, in different contexts or with different groups of teachers or children. Even for EEF trials, which are usually among the highest-quality studies available, only a minority (30 per cent) have a ‘high quality’ IPE, with 22 per cent being ‘low quality’ (Maxwell et al., 2021).

There is also a need to drive implementation forward using an effective programme of professional development, ideally with ‘highly skilled coaches’ and ‘structured peer-to-peer collaboration’ (Sharples et al., 2018, p. 34), and in the context of a ‘leadership environment and school climate that is conducive to good implementation’ (Sharples et al., 2018, p. 6). To be successful, evidence-based practice requires a lot of teacher and school leader expertise, along with well-established systems and a culture of school improvement and professional learning.

This all means that the implementation of programmes and practices supported by evidence is less like constructing a building from a blueprint, and more like assembling pieces of a jigsaw, with the evidence and information provided by research offering just a small handful of the pieces, and practitioners needing to locate and place the rest of the picture.

Pieces of the jigsaw

Our knowledge of how to get evidence into practice is still limited (Gorard, 2020). Having made the decision to implement an evidence-based programme or practice, what else must teachers and school leaders know and do to make it a success? In this section, we reflect on our experiences of two recent projects – the EEF’s ‘Cognitive science in the classroom’ review (Perry et al., 2021) and the EEF’s FLASH Marking evaluation (Morris et al., 2022) – and what they tell us about the missing jigsaw pieces that practitioners need to make evidence-based programmes and practices work on the ground. We loosely organise this around four Ps: Principles, Practices, People and Places.

Principles

Education trials are designed to test whether an intervention works rather than how. With such weak IPEs, it is rare that a strong understanding of a programme’s educational principles is developed, tested and reported during a trial. The know-how is usually there, but it is often to be found in the (mostly tacit) understanding and expertise of the intervention developer and/or a small number of key practitioners. Where this understanding is available, good programmes convey the principles of how their practices work.

Taking FLASH Marking as an example, we can see that the intervention was rooted in principles around formative assessment and feedback. Codes were used as a way of providing feedback on students’ work, with a view to increasing the quality and regularity of feedback, supporting their learning behaviours and progress, and reducing teachers’ workload. But the developers were clear that it was not just a case of teachers adopting a sheet of marking codes. Instead, teachers were provided with a series of day-long, subject-specific, practice-focused CPD sessions with other teachers across the trial. Through these sessions, colleagues were led through the underpinning evidence and principles that had informed the intervention. The IPE revealed the value that teachers placed on these CPD sessions: the knowledge and expertise of the development team was highlighted as a key factor in supporting the why of FLASH Marking as well as the how. 

Similarly, a key theme arising from the ‘Cognitive science in the classroom’ review was the importance of teachers understanding principles of learning from cognitive science. The applied evidence base is patchy, and rarely takes considerations such as subject, curriculum, pupil needs and teacher experience into account. Therefore, some of the best practice based on cognitive science involves teachers with strong understanding of cognitive science principles, intelligently interpreting and applying core cognitive science strategies across complex classroom environments.

Practices

New and enhanced practices are a common focus of CPD. Even so, a considerable amount of work is needed for teachers to adopt new strategies, contextualising them for their subjects, curriculum and classes, as well as developing resources, routines and habits to embed them in their practice. This applied practice cannot be easily specified within CPD. However, effective CPD can create the space and time for rehearsal of ‘faithful adoption’ and active experimentation for ‘intelligent adaption’ and contextualisation (Cordingley, 2015; Sharples et al., 2018, p. 5; Sims et al., 2021). This is hard to do well, both in the context of formal CPD programmes and within a school’s wider professional learning culture. The difficulties of effectively applying cognitive science at scale, identified in the cognitive science review, offer many important insights about evidence-informed CPD. Three points made by Yang and colleagues (2020, p. 558) are particularly pertinent:

The Cog-sci teachers might have benefited more if our professional development (PD) had offered more direct experiences with the optimal learning environment they are expected to construct …

Previous research has shown that what teachers learn in PD depends largely on the existing knowledge they bring to the activity and that they can have quite different takeaways from their learning experiences …

Personalizing PD to address teachers’ particular circumstances, knowledge, and experience holds promise for increasing their effectiveness.

These elements were also evident in the FLASH Marking evaluation: the professional development sessions and ongoing input from the development team were planned and delivered in a way that enabled teachers to situate the intervention in their existing school contexts, and to relate it to department members’ areas of expertise and interests. Teachers reported being particularly pleased with the flexibility of the approach and the fact that they could adapt the intervention to suit the texts that they were teaching, the exam boards with whom they were working and the needs of the classes that they taught. The provision of good resources and support can provide a scaffold for teachers to develop new practices, and high-quality CPD can empower teachers to take up these resources and use their professional judgement to embed them within their own classrooms.

People

Introducing a new intervention is necessarily a team effort. While a single person might make the decision to bring in the intervention, if it is to work, a wider group of people must be mobilised. This involves getting the whole staff body (including teaching assistants) on board to support the embedding and delivery of the programme, and communicating it effectively to pupils and, sometimes, parents.

We shouldn’t lose sight of the fact that professional learning is about people – their knowledge, skills, habits and values. There is a danger of a ‘de-peopled’ description of what interventions entail. As Steve Higgins observes (Higgins, 2018, p. 70), using the example of reciprocal teaching, the shorthand of saying it ‘works’…

airbrushes the teacher, the learners and their interactions from the picture and obscures the actions that they need to take for the use of reciprocal teaching to be successful. Who is it that is doing the “work”, the intervention, or the teacher and the learners?

Many interventions are quite specific about who should be responsible for implementation in schools. This often requires careful personnel management in terms of having the ‘right’ people in the specified roles, and organising that participation around the myriad other roles that these colleagues hold. Wider challenges, such as teacher shortages or funding cuts that reduce TA support, can significantly limit the staff available to implement an intervention.

Sharing and developing knowledge across a large group of people presents many logistical and educational challenges. Training in the FLASH Marking intervention was delivered via a cascade approach, whereby the developers delivered their CPD sessions to two members of each participating school’s English department. These teachers were then required to go back to their schools to share the training with their colleagues. While cascading CPD is often viewed as efficient, evaluations of school-based trials often point to challenges such as teachers not having enough time to deliver training effectively, key information or principles being omitted, and limited opportunities for follow-up or monitoring of how the training has been delivered. In the FLASH trial, the developers put careful thought into these challenges and into how heads of department could navigate them to ensure high levels of participation, support and implementation.

Places

All of this must play out in the context of an education setting. School communities create vibrant and changeable environments. The implications of this for evidence-informed CPD are numerous. These include operational factors – e.g. communication, administration, resources, timetabling, staffing, workload and so on – which often determine whether or not interventions are successfully embedded. There are also important factors relating to school culture. Leaders have a crucial role in creating ‘supportive environments, which enable evidence-informed practice to flourish’ (Cordingley et al., 2020; Nelson & Walker, 2019, see key finding 2). As Coldwell et al. (2017, p. 7) observe:

The most strongly research-engaged schools were highly effective, well-led organisations within which “research use” meant integrating research evidence into all aspects of their work as part of an ethos of continual improvement and reflection.

For programmes and practices based on evidence to work in classrooms, teachers must actively and expertly adapt everything: plans and schemes of work; resources, displays and the physical classroom environment; their own practice (see above); and the routines, expectations and ‘climate’ of their classrooms.

Assembling the jigsaw

Researchers do not and cannot have a blueprint for practice. Conceptions of evidence that see research as having the ‘answer’ to practical problems, in our view, over-reach what research can provide. Moreover, prescriptive approaches to using evidence can undermine rather than enhance professional expertise. For evidence-informed practice to work, researchers need to see that practitioners inevitably hold most of the pieces of the jigsaw required to make evidence work. Equally, we should recognise that research and evidence can provide important pieces of the jigsaw to support professional judgement in the context of school improvement and professional learning.

So how can practitioners assemble the jigsaw? Ultimately, it comes down to the creation of effective professional development and learning environments. These provide a context in which teachers and school leaders can think through and work on problems of practice, and bring together elements such as those we have described. There aren’t easy answers, but we do have a good understanding of effective professional development (Cordingley, 2015; Sims et al., 2021), implementation (Sharples et al., 2018) and leadership (Earley and Greany, 2022). We also have many expert practitioners who know what excellent professional development and classroom practice look like on the ground, and a system which, at its best, allows such knowledge to be shared. Research doesn’t provide a blueprint for any of this; however, when researchers and practitioners work in genuine partnership, it can make for an especially powerful combination.
