Despite recent policies to support evidence-informed teaching, and a number of important practical developments – of which the new Chartered College of Teaching is one – we still don’t know a great deal about the current extent or depth of evidence-informed practice across schools in England. This paper presents findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue in late 2014. It suggests that at this point, academic research was having only a small-to-moderate influence on decision-making relative to other sources, despite teachers generally reporting a positive disposition towards research. Additionally, it suggests that this positive disposition towards research, and perceptions of research engagement, were not necessarily transferring into an increased conceptual understanding of research knowledge. Implications of the findings are discussed in the context of current developments towards evidence-informed practice, including the EEF’s own approaches to supporting research engagement and use.
Anyone with an eye on recent educational developments will be aware of discussions about ‘evidence-informed practice’ (EIP), which is why it is great to see the interim issue of the Chartered College of Teaching’s journal focusing on this issue. The College is in a unique position to act as a broker of knowledge between the teaching profession and the research community, and as a body able to signpost teachers to relevant and useful research. This is encapsulated in two of its founding objectives: creating a knowledge-based community to share excellent practice; and enabling teachers to connect with rigorous research and evidence.
But why has the College established itself with this remit, and why is it so important that teachers take an evidence-informed approach to practice? First, a number of recent research studies have suggested that evidence-informed schools play an important part in effective education systems (CUREE, 2015; Greany, 2015; Mincu, 2014; Schleicher, 2011), with a clearer understanding emerging of how research evidence can feed into those systems (Sharples, 2013). Second, there has been a recent surge in teacher demand for evidence, illustrated, for example, by the rapid rise of grass-roots initiatives such as researchEd (www.workingoutwhatworks.com) and the new Research Schools Network (https://researchschool.org.uk). This differentiates the current decade from previous years, in which calls for EIP came, predominantly, from university academics or researchers (see, for example, Hargreaves, 1996; Weiss, 1979).
Despite policies to support evidence-informed teaching, and a number of important practical developments – of which the new Chartered College of Teaching is one – we still don’t know a great deal about the current extent or depth of evidence-informed practice across schools in England. This paper presents findings from a survey co-developed by the National Foundation for Educational Research (NFER) and the Education Endowment Foundation (EEF), which captured information about this issue (full findings available at https://educationendowmentfoundation.org.uk/our-work/resourcescentre/research-use-survey/). The survey was developed to provide a measure of research engagement across a series of projects, funded by the EEF, that aim to increase schools’ awareness, and use, of research evidence (https://educationendowmentfoundation.org.uk/news/eef-launches-1.5-million-fund-to-improve-useof-research-in-schools/). It was also intended to inform the EEF’s overall approach to scaling-up and mobilising evidence – a key priority for the organisation in the second five years of its life.
The following points are important in interpreting the results:
1. Evidence is a broad term. There are many forms of evidence at a teacher’s disposal, including classroom data, pupil performance data, information from research, management data and, of course, professional judgment. It is the combined application of these different forms of evidence that creates EIP.
2. However, information from research tends to be used less frequently, and less effectively, than these other sources of evidence. For this reason, we developed our survey to focus specifically on the extent and nature of teachers’ uses of research evidence as an important component of EIP.
3. Research evidence was defined in the survey as: ‘paper or web-based articles, reports, books or summaries based on academic research’. We used the term ‘academic research’ to clearly distinguish between systematic research carried out in universities or professional research organisations and other sources such as comment pieces, books written by practitioners or practice information shared at teacher gatherings.
4. The data were collected in late 2014 and should therefore be seen as a snapshot of research engagement at that time. Elements of the survey will be repeated in the academic year 2017-18, hopefully providing an indication of how research engagement has changed over this period.
Teachers had a positive view of research, and generally saw themselves as research-engaged
When we asked teachers a series of direct questions about academic research, and how they felt it influenced their practice, the majority (typically two-thirds of teachers) demonstrated that they valued it, engaged with it and used it to change classroom practice.
Research had a relatively small impact on informing teachers’ decision-making, compared to other sources of information
The survey contained a variety of questions about the relative influence of different sources of information in informing decision-making about approaches to teaching and learning. The questions were framed so that they were not biased towards answers that referred to research.
When we analysed this data we found that:
- Information based on academic research or contained in online evidence platforms (see, for example, the EEF’s Teaching and Learning Toolkit) had only a small-to-moderate influence (approximately one-fifth of teachers identified these sources, with some variation according to question).
- Teacher-generated ideas, from both within and outside the teacher’s own school, had a much greater influence (between one-half and three-quarters of teachers identified these sources through various questions).
- Continuing Professional Development (CPD) was another key influence (around half of respondents identified this source). CPD was more likely to be based on teacher-generated ideas or the expertise of external consultants than on academic research.
- The sources that teachers found easiest to understand were: colleagues in their own schools; pupil performance data; CPD information; and colleagues in other schools. Information based on academic research was reportedly less easy to understand.
How we interpret these findings:
These findings show the different impressions we can form about research engagement when asking explicit questions about the use of research, compared to exploring the use of research evidence relative to other sources. It is perhaps not surprising that research emerges as a less prominent influence when it is measured alongside other important sources.
The survey indicates that teachers tend to listen to other teachers, and schools tend to draw on the support of other schools. This suggests that an important mechanism for embedding EIP is via peer-to-peer support and school networks. Drawing on these findings, the EEF is increasingly collaborating with practice partners to help disseminate and apply research knowledge, through initiatives like the Research Schools Network.
Teachers’ knowledge of what the academic research evidence says was mixed
In addition to capturing self-reported measures of research engagement, the survey contained two sets of objective questions, designed to provide an indication of teachers’ research knowledge and their understanding of the robustness of different research methods (research literacy). The results showed a variable range of knowledge by question, but overall a relatively low level of knowledge of the evidence on effective strategies for teaching and learning, and of the value of different types of research methods.
Teachers found questions requiring scientific or specialist research knowledge more difficult to answer correctly than questions relating to general teaching and learning. For example, over two-thirds (67 per cent) of teachers knew that the statement ‘setting pupils by ability improves learning outcomes for all pupils’ was incorrect, but only around one in eight (13 per cent) recognised that the statement ‘drinking six to eight glasses of water per day improves pupil learning outcomes’ was also incorrect.
How we interpret these findings:
These findings illustrate the extent to which ideas can become embedded in the teaching profession when they resonate with messages outside of research. It is not surprising that teachers were unsure about statements regarding the impact of water on learning, given the widespread advertising of the benefits of drinking water and an active market in propagating educational ‘neuromyths’ (see the work of the Organisation for Economic Cooperation and Development (OECD) Centre for Educational Research and Innovation on this topic: www.oecd.org/edu/ceri/neuromyths.htm). Take ‘learning styles’, for example – there is no research evidence to support the statement that ‘individual pupils learn best when they receive information in their preferred learning style (e.g. auditory, visual, kinaesthetic)’, despite the government previously endorsing this technique (Department for Education and Skills, 2004). These findings illustrate the need for more widespread training and awareness of research evidence, through resources such as the EEF’s Teaching and Learning Toolkit.
Levels of research engagement differed by school and by teacher
We used a statistical technique called factor analysis to summarise and analyse information from the survey. Factor analysis explains variability among responses and identifies trends in the data. Any responses that are correlated across the survey are grouped together into single ‘factors’. These factors have greater reliability than individual responses.
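As an illustration of this grouping step, the sketch below simulates survey responses in which two underlying dispositions drive five items, then inspects the eigenvalues of the item correlation matrix. The respondent count, item count and loadings are invented for the example and do not come from the NFER/EEF survey.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical number of respondents

# Two invented latent dispositions, e.g. 'values research' and 'uses research'
f1 = rng.normal(size=n)
f2 = rng.normal(size=n)

# Five hypothetical survey items: items 1-3 mainly reflect f1, items 4-5 reflect f2
items = np.column_stack([
    f1 + 0.4 * rng.normal(size=n),
    f1 + 0.4 * rng.normal(size=n),
    f1 + 0.4 * rng.normal(size=n),
    f2 + 0.4 * rng.normal(size=n),
    f2 + 0.4 * rng.normal(size=n),
])

# Correlated items collapse into a small number of factors: the two largest
# eigenvalues of the correlation matrix absorb most of the variability
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
print(np.round(eigenvalues, 2))
```

In practice a dedicated factor-analysis routine (with rotation) would be used rather than a raw eigendecomposition; the point here is only that correlated responses collapse into a small number of composite measures that are more reliable than any single item.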
Five factors emerged through our analysis, and an additional measure was created based on the knowledge questions. We used these factors, and the knowledge score, to explore differences in responses between groups of teachers and schools. The analysis showed that:
- Senior and middle leaders were more likely to be research-engaged than classroom teachers (on all factors and across all school phases).
- Teachers in secondary schools were more likely to be research-engaged than primary school teachers (on some of the factors).
- Teachers in schools with the Ofsted ratings ‘requires improvement’ or ‘inadequate’ were more likely to use online evidence platforms than teachers in Ofsted-rated good or outstanding schools (across all school phases).
Our survey also contained a question that explored what the term ‘evidence-based teaching’ (EBT) meant to teachers. Interestingly, we discovered that an individual teacher’s definition of EBT was a good indicator of research engagement and wider knowledge. Teachers who selected answers containing a reference to academic research in their definition scored more highly on almost all our factors than teachers who did not.
How we interpret these findings:
There appears to be a higher level of research engagement among school senior leaders than classroom teachers. This suggests that senior leaders should aim to model their own enthusiasm for research, encourage leadership teams to take an evidence-informed approach and, in turn, support their colleagues. It is unclear why teachers in primary schools were less likely than their secondary peers to use research to inform their teaching, or to have knowledge of research. Nevertheless, this finding suggests that researchers, and intermediaries between research and practice, need to put effort into raising awareness and understanding of research for the primary phase. The EEF’s current ‘campaign’ to develop evidence-based primary literacy in North East England is an example of such support.
Conclusions and further work
Our survey presents some novel ideas about how we can investigate, quantify and analyse teachers’ research engagement, but it cannot tell us everything we might like to know. It is designed to be used alongside qualitative methods (for example, interviews, observations and case studies), which can capture deeper and more nuanced aspects of research engagement. The survey has also raised some questions.
- How do teachers become research-engaged and research-literate? Our survey focused on teachers’ explicit awareness of the sources they were using, but there are many other, often implicit, ways in which research-based information can become embedded in teachers’ professional practice.
- How is research used in practice? We were able to ascertain whether or not teachers said they used research evidence to inform or change classroom practice, but we were unable to explore how the evidence was implemented, adapted and evaluated.
- What is the role of school culture, trust and relationships in supporting research use?
- What is the relationship between teacher research and academic research? How do, and can, these different forms of evidence interact effectively?
We also need better understanding about how teachers ‘blend’ research evidence with other sources of information to create an evidence-informed approach. Our survey considers how teachers’ use of research evidence compares with use of other sources (such as pupil performance data or colleagues’ expertise), but in an evidence-informed school we would expect these sources of information to complement and support each other. This is an explicit objective of EEF-funded scale-up activities, hence ongoing evaluations should provide useful insights into the interplay between different forms of evidence.
Our survey was piloted in November 2014 with a sample of 1,200 secondary and 900 primary schools. Each school was provided with five copies of the questionnaire, to be completed by up to five members of staff, equating to samples of 6,000 secondary and 4,500 primary teachers respectively (10,500 teachers in total). We offered a £5 incentive to the first 350 responding teachers. We quickly achieved 509 responses across 256 schools (an average of two responding teachers per school, and an achieved response rate of just under five per cent of sampled teachers).
The main aims of the pilot were to check the functioning of survey items and to create reliable factors. For these purposes, it was appropriate to draw a random sample of schools from the population of schools in England and it was sufficient to have 300 responses at respondent (teacher) level. We did not need to achieve a representative sample of teachers; just a sample that covered the range of responses required to undertake factor analysis.
On analysis of the data, we realised that the pilot had yielded some interesting results. The EEF therefore commissioned the NFER to explore the findings from the pilot study in order to understand current levels of teacher research engagement. Because the sample was not designed to be representative of the entire teaching workforce, we calculated confidence intervals around the results. Assuming no sampling bias, we can be 95 per cent confident that results collected from all teachers in England would lie within six percentage points of the results presented in this report; this measure of uncertainty applies to a random sample of schools and of teachers within schools, and with bias in the sampling the true population percentage could lie outside this range.
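As a rough illustration of where a figure of this order comes from, the snippet below computes the half-width of a simple 95 per cent confidence interval for a proportion from 509 independent responses. The worst-case proportion of 0.5 is an assumption chosen to maximise the interval; the report’s six-point figure is wider than this simple-random-sample result because it also allows for the clustering of teachers within schools, which this formula ignores.

```python
import math

n = 509   # achieved responses in the pilot
p = 0.5   # assumed worst-case proportion (maximises interval width)
z = 1.96  # critical value for a 95 per cent confidence level

# Half-width of a simple random-sample confidence interval for a proportion
half_width = z * math.sqrt(p * (1 - p) / n)
print(f"about +/- {half_width * 100:.1f} percentage points")
```

Clustering (here, roughly two respondents per school) reduces the effective sample size, which is one reason a reported interval can be wider than this naive calculation suggests.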
Centre for the Use of Research and Evidence in Education (2011) Report of Professional Practitioner Use of Research Review: Practitioner Engagement in and/or with Research. Coventry: CUREE (online). Available at: http://www.curee.co.uk/files/publication/1292498712/PURR%20Practitioner%20Summary.pdf (accessed 29 June 2015).
Department for Education and Skills (2004) Pedagogy and Practice: Teaching and Learning in Secondary Schools (online). Available at: http://dera.ioe.ac.uk/5706/7/DfES%200442-2004G%20PDF_Redacted.pdf (accessed 29 March 2017).
Greany T (2015) How can evidence inform teaching and decision making across 21,000 autonomous schools? Learning from the journey in England. In: Brown C (ed) Leading the Use of Research and Evidence in Schools. London: Institute of Education Press, pp.11-28.
Hargreaves D (1996) Teaching as a research-based profession: Possibilities and prospects. Paper presented at the Teacher Training Agency Annual Lecture, April 1996 (online). Available at: http://eppi.ioe.ac.uk/cms/Portals/0/PDF%20reviews%20and%20summaries/TTA%20Hargreaves%20lecture.pdf (accessed 23 January 2017).
Mincu M (2014) Inquiry paper 6: Teacher quality and school improvement – what is the role of research? In: British Educational Research Association/the Royal Society for the Encouragement of Arts, Manufactures and Commerce (eds) The Role of Research In Teacher Education: Reviewing The Evidence (online). Available at: https://www.bera.ac.uk/wp-content/uploads/2014/02/BERA-RSA-Interim-Report.pdf (accessed 23 January 2017).
Schleicher A (2011) Building a High-Quality Teaching Profession: Lessons from Around the World. Paris: OECD Publishing (online). Available at: http://www.oei.es/formaciondocente/materiales/INFORMES/2011_OCDE.pdf (accessed 29 June 2015).
Sharples JM (2013) Evidence for the Frontline. London: Alliance for Useful Evidence (online). Available at: http://www.alliance4usefulevidence.org/assets/EVIDENCE-FOR-THEFRONTLINE-FINAL-5-June-2013.pdf (accessed 23 January 2017).
Weiss C (1979) The meanings of research utilization. Public Administration Review 39(5): 426-431.