Paul A Kirschner, Emeritus professor, Educational Psychology, Open University, Netherlands; Guest professor, Expertise Centre for Effective Learning (EXCEL), Thomas More University of Applied Sciences, Mechelen, Belgium
Tim Surma, Manager, Expertise Centre for Effective Learning (EXCEL), Thomas More University of Applied Sciences, Mechelen, Belgium
Welcome to this issue of Impact on evidence-informed pedagogy. The reason for collating a publication with this theme is straightforward: if we, as educational professionals, inform the choices that we make for our practice with the best available evidence, we can make meaningful improvements to our pedagogical practice, and thus to the efficiency, effectiveness and success of our teaching and of children’s learning.
What is evidence-informed pedagogy?
Some educational policy-makers, politicians and teachers use the term ‘evidence-based’ when they speak of instruction and teaching, while others (we, for example) use the term ‘evidence-informed’. Is there a difference and, if so, what is it? There is a distinction, albeit sometimes subtle, between evidence-based and evidence-informed practice in education. Originating in medicine but now used across numerous professions such as economics, technology and agriculture, evidence-based practice is an approach that focuses practitioners’ attention on sound empirical evidence in professional decision-making and action (Rousseau and Gunia, 2016). In medicine, for instance, research processes are more rigorous, well defined and easily controllable than in the educational sciences, which makes outcomes more distinct and reliable. As Neelen and Kirschner (2020, p. 3) state:
“Sackett et al (1996) see it as a three legged stool integrating three basic principles: (1) the best available research evidence bearing on whether and why a treatment works, (2) clinical expertise of the health care professional (clinical judgment and experience) to rapidly identify each patient’s unique health state and diagnosis, their individual risks and benefits of potential interventions, and (3) client preferences and values.”
Here, everything is clear cut. The target population is clearly defined with respect to age, weight, disease and so forth. Further, the directions for use are clear cut – for example, that the medicine should be consumed on an empty stomach, one hour prior to eating.
Evidence-informed practice is still based on empirical evidence, but acknowledges that in real classroom practice it is harder to determine what works for whom under which circumstances. What seems to work in one classroom does not always work in another. Five-year-olds differ from 15-year-olds in both their cognitive development and their knowledge and expertise; a lesson on concepts and definitions is different from a lesson on applications; and, to a lesser extent, a lesson in chemistry differs from a lesson in drawing. Further, what works for one teacher might not work for another: subtle and not-so-subtle differences between teachers mean that the same approach differs both in how it is carried out and in how it is perceived by students. Likewise, what works in a lesson today won’t necessarily work in the same lesson this afternoon, tomorrow or in three months. The mere fact that learners differ in their prior knowledge, beliefs, needs and/or motivation to participate can change everything. Unfortunately, this entropy (i.e. lack of order or predictability) in the classroom does not allow us to predict with statistical ‘certainty’ which intervention will yield which effect and when. Even in perfect circumstances, with the best-prepared lessons, some of our students might still underperform, despite the evidence brought to us by eminent cognitive and educational psychologists. While ‘evidence-based’ yields fairly hard results, ‘evidence-informed’ is less hard but still very useful, with a higher chance of success if applied thoughtfully. This is why in this issue we advocate a pedagogy informed by evidence, rather than a pedagogy based on (or dictated by?) evidence.
The challenge of going from the evidence to the design of actual pedagogical practices in the classroom calls for a deep understanding – let’s call it pedagogical knowledge – of what, why and when something works in optimal conditions in order to have, for example, conversations with your fellow teachers and headteachers on certain pedagogical decisions or actions.
The literature also presents a variety of accounts of exactly what pedagogy is. Since pedagogy has both broad and narrow definitions, we have had to make a choice and have chosen to follow Dylan Wiliam (Black and Wiliam, 2018) in using the broad definition. He cites Alexander (2008) in stating that pedagogy is ‘what one needs to know, and the skills one needs to command, in order to make and justify the many different kinds of decision of which teaching is constituted’ (p. 47). It can be seen as the act and discourse of teaching (Alexander, 2004). Pedagogy therefore includes instruction but is broader and also embraces the interplay between factors that influence teaching and learning. Both evidence-informed practice and pedagogy assume that the educational professional knows what the best options for optimal teaching and learning might be under given circumstances (knowing your repertoire as a teacher).
What do we know about teacher repertoires? We have already learned a lot about classroom practices from the abundance of quality research conducted in laboratories and schools, online and offline, and virtually anywhere and anytime that teaching and learning take place. The evidence is out there. Over the past few decades, researchers have designed interventions and devised general techniques – frequently based on a two-way-street interaction between teachers and researchers – that work or do not work for particular learners of particular ages undertaking particular academic tasks in particular subject areas (see, for example, Roediger and Pyc, 2012). Some fundamental techniques from cognitive and educational research have been derived from this substantial base of empirical research, and some of these are gaining attention as they hold the potential of being sufficiently general that they can be applied in a range of academic subject matter areas and readily implemented in classrooms across all ages (for an overview of learning strategies, see Dunlosky et al., 2013). Several examples of effective techniques are elaborated on in this issue, as these general approaches may need domain-specific adjustments to maximise their promise as learning tools for particular domains, which in itself is a shining example of evidence-informed practice. Retrieval practice – the act of engaging in active recall of already-learned information (see Roediger and Karpicke, 2006) – is adapted to the perspective of CPD by Alex Beauchamp; Clare Badger translates cognitive load theory (Sweller, 1988) into practical guidelines for chemistry courses; the provision and function of individual feedback (Hattie and Timperley, 2007) is tackled by Caroline Locke; and popular learning myths (Kirschner and Van Merriënboer, 2013) are challenged by both Jonathan Firth and Jennifer Zyke, as well as by Lewis Baker.
Other evidence-informed principles are much less domain-general, which is why Richard Mayer (2004) once referred to them as theories of subject matter; some of these are still regarded as a ‘unique and monumental contribution of educational psychology to the science of learning’ (Mayer, 2018, p. 175). Striking examples are the theories of how people learn to read (i.e. explicit phonics instruction), how they learn a second language and so forth. This issue also pays attention to the subject-specific uniqueness of teaching, with a focus on a selection of subjects that are less often in the spotlight of educational research, such as the arts (by Laura Gatward) and physics (by Cecilia Astolfi).
Although we may now sound boundlessly self-confident about evidence informing education, we must of course temper our enthusiasm to some extent: we obviously do not know the answer to every question, for the simple reason that education does not take place in isolation. The evidence is out there – but it is neither definitive nor complete. For example, the concept of affect (students’ experience of emotion) is gaining growing recognition as an essential component of teaching and learning, but it still holds many secrets for both researchers and seasoned teachers (Mayer, 2018). Some consideration was therefore given in this issue to educational outcomes beyond the retention of basic declarative and procedural knowledge. Several articles explore pedagogies such as playful learning in the Early Years (by Sarah Seleznyov), reading for pleasure (by Alice Reedy) and the crossroads between coaching and direct instruction (by Ed Cope and Chris Cushion). Given the broad range of content areas represented in this issue, readers should not be surprised that the educational outcomes discussed here differ greatly.
The UK occupies a leading position worldwide in supporting the implementation of structured, evidence-informed education, with influential research centres such as the Education Endowment Foundation and professional learning communities such as the Chartered College of Teaching and the ever-growing researchED community. A number of articles in this issue zoom in on this fascinating but complex interplay between research and practice. Andrew Davis elaborates on classroom research, Richard Churches and colleagues shine a light on teacher-led randomised controlled trials, and Lorne Stefanini and Jenny Griffiths address some of the challenges of implementing an evidence-informed approach to education.
This issue might be what David Daniel (2012) described as a targeted investment in translational research: with this issue, the Chartered College of Teaching supports the development of pedagogical approaches with the goal of understanding how, when and under what constraints to apply best-evidence strategies in relevant educational contexts. Readers will find a multiplicity of approaches in the current issue, all aimed at revealing how to inform your pedagogy with the best available evidence. We hope that this issue can help you to make more and better evidence-informed decisions.
Enjoy, learn and use the content to reflect upon and improve both your teaching and your students’ learning!
Alexander R (2004) Still no pedagogy? Principle, pragmatism and compliance in primary education. Cambridge Journal of Education 34: 7–33.
Alexander R (2008) Essays on Pedagogy. London: Routledge.
Black P and Wiliam D (2018) Classroom assessment and pedagogy. Assessment in Education: Principles, Policy & Practice 25: 551–575.
Daniel DB (2012) Promising principles: Translating the science of learning to educational practice. Journal of Applied Research in Memory and Cognition 1: 251–253.
Dunlosky J, Rawson KA, Marsh EJ et al. (2013) Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest 14(1): 4–58.
Hattie J and Timperley H (2007) The power of feedback. Review of Educational Research 77: 81–112.
Kirschner PA and van Merriënboer JJ (2013) Do learners really know best? Urban legends in education. Educational Psychologist 48: 169–183.
Mayer RE (2004) Teaching of subject matter. In: Fiske ST (ed) Annual Review of Psychology, Vol 55. Palo Alto, CA: Annual Reviews, pp. 715–744.
Mayer RE (2018) Educational psychology’s past and future contributions to the science of learning, science of instruction, and science of assessment. Journal of Educational Psychology 110: 174–179.
Neelen M and Kirschner PA (2020) Evidence-Informed Learning Design: Creating Training to Improve Performance. London: Kogan Page.
Roediger HL and Karpicke JD (2006) The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science 1: 181–210.
Roediger H and Pyc M (2012) Inexpensive techniques to improve education: Applying cognitive psychology to enhance educational practice. Journal of Applied Research in Memory and Cognition 1: 242–248.
Rousseau DM and Gunia BC (2016) Evidence-based practice: The psychology of EBP implementation. Annual Review of Psychology 67: 667–692.
Sackett DL, Rosenberg WM, Gray JM et al. (1996) Evidence based medicine: What it is and what it isn’t. BMJ 312: 71–72.
Sweller J (1988) Cognitive load during problem solving: Effects on learning. Cognitive Science 12: 275–285.