
An evidence-based approach to CPD

Written by: Rob Coe and Stuart Kime
Rob Coe, Director of Research and Development, Evidence-Based Education, UK
Stuart Kime, Director of Education, Evidence-Based Education, UK

The existence of this issue, and indeed of the Impact journal itself, is evidence of a depth and breadth of practitioner engagement with research evidence that is wonderful to see. Class teachers, school leaders and those who work directly with them to support their development are engaging with research, applying it to practical questions, developing high levels of expertise in their knowledge of the findings and methods of educational research, and effectively sharing their reflections and insights. To us, all of this feels new, important and exciting.

However, let us sound a note of caution.

We know that engagement with research is not enough. A number of programmes designed to enhance teachers’ and school leaders’ knowledge about research – including one to whose design and delivery both of us contributed (Wiggins et al., 2019) – have been evaluated, with no real evidence of impact on subsequent student outcomes (Brown et al., 2021). For those who have engaged with research evidence, seen the light, reflected on it and tried to incorporate it into their practice, it may seem self-evident that knowledge of educational research has been transformational. And, indeed, it may have been, for them. In part, this paradox is explained by the blurring of an important conceptual distinction: between describing what appeared to work in one context and giving generalisable advice about what will work in another.

‘What worked for us’ vs ‘What will work for you’

When asked for the secret of his success, billionaire J. Paul Getty is reported to have said: ‘Rise early, work hard, strike oil.’ As a description of how he became successful it may be accurate, albeit over-simplified; as advice to anyone else, it is worthless. The point is that a description of what worked for you should not be mistaken for advice about what will work for someone else.

Descriptions based on personal experience can provide rich insight and provoke valuable reflection – the paper in this issue by Lofthouse et al. is a great example. As we read them, we may think ‘Maybe that would work in my context’ or ‘This helps me to think about what might work in my context, and why.’ And good descriptive research is careful not to make this transfer claim: the reader, not the writer, must judge its applicability.

Even here, though, we should be cautious about taking those descriptions at face value. In the example of a teacher who is sure their practice has been transformed by research evidence, there may be other explanations. Despite their perception that their teaching is now radically different, it is possible that an observer would see little change. Or that their practice has indeed changed, but its impact on their students’ outcomes has not. Or that the change, while attributed by them solely to their reading, studying and thinking about educational research, would not have happened in the absence of other factors whose role may be hard to disentangle (such as the support and critical-friend challenge of colleagues, school culture and leadership support). Or, perhaps, given these other support factors, it would have happened anyway, even without any interest in research evidence. Our own perceptions of our behaviour, and of its causes, are not always trustworthy.

If we do want to offer advice to others, then the evidential bar must be much higher. There are too many examples of approaches that are obviously successful, at least in the perceptions of those involved in the context in which they were developed, but that fail to generalise. When these programmes are evaluated robustly, we too often find that the impact is too small to be detectable – or even negative (see, for example, Humphrey et al., 2020).

Implementation

A further challenge with giving advice or trying to specify a programme of improvement is that there is very little in education that can be implemented according to a recipe or manual – and remain effective. The challenge of implementation can be thought of, mostly, as a problem of expertise: doing complex things well depends on the knowledge, intuition, insight and skill to make appropriate adaptations to the context, prioritise what really matters and mobilise the people and resources around us to realise the intended programme in ways that are effective and sustainable. It follows that we need to build that expertise deliberately and systematically if we want to help teachers and leaders get better at the things that really matter to them and their students.

And, just as Barker and Rees argue that leadership skills are not generic but depend on domain-specific knowledge, the same is almost certainly true of implementation: it, too, depends on domain-specific knowledge.

So, this is hard; should we just give up?

No, definitely not!

One clear takeaway is that if implementation depends on specific expertise, then we need to generate that expertise. High-quality CPD that focuses on developing the knowledge, skills and understanding required to implement a particular programme effectively must be built in.

Another is that implementing complex programmes that require adaptation (i.e. most educational interventions) is unlikely to be successful without effective, real-time evaluation. Having enough of the right kind of expertise may help to inoculate against the ‘lethal mutations’ that can creep in when a programme is adapted, but it is not a guarantee (and such expertise is probably rarer than we would like). So, constant monitoring of both what is being done and its outcomes provides the feedback needed to inform those adaptations. Without that kind of feedback, only the most expert are likely to implement well; but with good feedback, almost anyone can see the impact of their adaptations and learn to make an intervention work.

Of course, this kind of formative evaluation is not easy to do well and, once again, we return to the need for high-quality CPD to build the expertise required. But what should that high-quality CPD look like? To help answer this question, let’s start from the assumption that teachers and leaders are learners. What do we know about supporting learners?

Every teacher is a learner

Teachers know a lot about learning and how to make it happen, but it seems all too common for that knowledge to be disregarded in the design, delivery and experience of their own professional learning. It would be a routine expectation in most schools and colleges to provide students with a sequenced curriculum that maps out essential learning aims, diagnostic assessments, models, scaffolds, feedback, practice and more feedback. To what extent are these supports routinely provided to teachers in their own professional learning?

Comparing the ways we routinely help pupils to learn hard ideas or processes and the things we do to support teachers’ professional learning provides a useful check on any professional learning strategy. For example, if a school’s approach to CPD consists of self-help groups of teachers working together on challenging content, observing and supporting each other, would you expect a similar approach to work for pupils trying to learn hard ideas like letter sounds, column subtraction, urbanisation, the origins of WW1 or thermodynamics? Self-help groups may be a part of the story, but without a lot of other inputs they may well not be as effective as we might hope.

Another useful comparison is with learning practical skills such as golf, tennis, football, piano, guitar, cookery or cabinetmaking. Here typical approaches involve coaching by an expert, often one-to-one or in a small group, with an emphasis on spending a lot of time in ‘deliberate practice’ of the skill (Ericsson, 2006; Ericsson et al., 1993). If you think you can learn to be a better teacher by reading books and blogs, attending presentations and conferences, reflecting and having intense conversations with colleagues, could you see a similar approach working to improve your skill in darts, yoga or chess? These reflective activities may be useful, but you would need to do a few other things as well.

Helping teachers to gain new knowledge, to develop insights and understandings of relevant underpinning theory, to build skills and techniques and to acquire and embed new habits, can all be thought of as a learning process. That means we are firmly in the territory of applying what we know about the conditions that optimise learning to the special case of professional learning.

Like every child or young person, every teacher is a learner. Current research in cognitive neuroscience indicates that ‘the child brain has essentially the same structures as the adult brain, carrying out essentially the same functions via the same mechanisms’ (Goswami, 2020). Research on teacher effectiveness also suggests that what teachers know and do makes more difference to student outcomes than anything else we can change (Coe et al., 2020). Teachers are learners, professional learning is like any other kind of learning, and the potential it offers for transformative change is a source of both hope and challenge. It is also critical to acknowledge that all teachers can learn to get better, however good they are right now. The life chances of the children and young people they teach this year, next year and onwards depend on how well they do their job. What could be more important than that?

Research also suggests that features of a school’s environment and culture (such as the approach to monitoring and performance management, prioritisation of professional learning, collaboration, trust, shared challenging goals, shared efficacy beliefs, safety, order and support) can make a substantial difference to the quality and impact of professional learning, and affect student attainment directly, as well as teacher retention (Kraft and Papay, 2014; Weston et al., 2021). If we want to help teachers to improve their everyday teaching, we must pay attention to these factors too.

More to learn

The articles in this issue of Impact, and the wider engagement with research evidence about CPD seen in books such as Zoe and Mark Enser’s ‘The CPD Curriculum’ (Enser and Enser, 2021), give cause for great hope and excitement: more and better evidence continues to become available, and the appetite for it grows. But as we learn more from it, that hope and excitement should be tempered with caution: we must remain ready to be surprised by the evidence, and to change with it.

References

Brown C, Poortman C, Gray H et al. (2021) Facilitating collaborative reflective inquiry amongst teachers: What do we currently know? International Journal of Educational Research 105: 101695.

Coe R, Rauch CJ, Kime S et al. (2020) The Great Teaching Toolkit: Evidence Review. Available at: https://evidencebased.education/great-teaching-toolkit/ (accessed 11 August 2021).

Enser M and Enser Z (2021) The CPD Curriculum: Creating Conditions for Growth. Carmarthen: Crown House Publishing Ltd.

Ericsson KA (2006) The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charness N, Feltovich PJ (eds) The Cambridge Handbook of Expertise and Expert Performance. Cambridge: Cambridge University Press, pp. 685–705.

Ericsson KA, Krampe RT and Tesch-Römer C (1993) The role of deliberate practice in the acquisition of expert performance. Psychological Review 100(3): 363.

Goswami U (2020) Toward realizing the promise of educational neuroscience: Improving experimental design in developmental cognitive neuroscience studies. Annual Review of Developmental Psychology 2: 133–155.

Humphrey N, Squires G, Choudry S et al. (2020) Achievement for All: Evaluation report. Available at: https://educationendowmentfoundation.org.uk/public/files/Projects/Evaluation_Reports/Achievement_for_All_(final).pdf (accessed 11 August 2021).

Kraft MA and Papay JP (2014) Can professional environments in schools promote teacher development? Explaining heterogeneity in returns to teaching experience. Educational Evaluation and Policy Analysis 36(4): 476–500.

Weston D, Hindley B and Cunningham M (2021) A culture of improvement: Reviewing the research on teacher working conditions. Available at: https://tdtrust.org/wp-content/uploads/2021/02/A-culture-of-improvement_-reviewing-the-research-on-teacher-working-conditions-Working-Paper-v1.1.pdf (accessed 11 August 2021).

Wiggins M, Jerrim J, Tripney J et al. (2019) The RISE project: Evaluation Report. Available at: https://educationendowmentfoundation.org.uk/public/files/Projects/Evaluation_Reports/RISE_Report_final.pdf (accessed 11 August 2021).
