Evidence-based practice in schools: What’s really going on?
Improving educational outcomes through high-quality provision is one of the main aims of education systems across the world. Over recent decades there has been a concerted move by policymakers in many countries to promote more evidence-informed approaches in schools (OECD, 2017), and this has led to growing awareness of how research findings can be used to improve the quality of school provision (Slavin, 2020; Brown et al., 2017).
Despite this greater recognition of the role of research and evidence in schools, research also indicates that it is still infrequently used by teachers to inform their teaching practice (Nelson et al., 2017) and, importantly, there is a lack of good-quality research evidence to tell us how to mobilise evidence into use in education (Gorard et al., 2020). Following the disruption caused by the Covid-19 pandemic, it is now essential for schools to identify strategies and programmes to help learners catch up. School leaders need access to a wider range of good-quality research to help them identify more promising strategies and ensure staff are not burdened with the implementation of unproven or ineffective interventions. Yet very little research is currently available to assess the range of interventions being used in schools and the evidence base behind them.
‘Despite a greater recognition of the role of research and evidence in schools, research indicates that it is still infrequently used by teachers to inform their teaching practice, and there is a lack of good-quality research evidence to tell us how to mobilise evidence into use in education.’
Our recent study (Pegram et al., 2022) evaluated the range and evidence base of the interventions used in a cluster of secondary and primary schools in Wales. We initially identified 242 interventions in use across the 10 schools. After screening, we conducted a rapid systematic review of the literature for 138 interventions and found that 42 (30 per cent) had some evidence of positive impact on pupil outcomes, 92 (67 per cent) had no published evidence, and four (3 per cent) had causal evidence to suggest they were ineffective. To assess the quality and trustworthiness of these findings, we used an evidence ‘sieve’ that rates the quality of the research on a five-point padlock scale from 0–4🔒, with 4🔒 being the highest-quality evidence (Gorard, 2014; Gorard et al., 2020). Importantly, our findings show that 19 per cent of the interventions used in the cluster had preliminary evidence rated as lower quality (0–1🔒), and 11 per cent had positive causal evidence rated as moderate quality or higher (2–4🔒).

To find out how access to this evidence base influenced school provision, we presented individualised reports of the findings to each school leader and returned one year later to re-evaluate the schools’ interventions. Three of the four schools that took part in the follow-up had made no changes to their provision and continued to use the same interventions. The remaining school, a primary, had made changes to its provision and discontinued the use of three interventions.
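For readers who want to check the arithmetic behind these headline figures, the short sketch below reproduces the percentage breakdown of the 138 reviewed interventions. It is illustrative only: the counts are those reported above, while the variable names and rounding choices are ours, not taken from the study.

```python
# Illustrative sketch: percentage breakdown of the 138 reviewed interventions
# reported in Pegram et al. (2022). Variable names are hypothetical.

reviewed_total = 138  # interventions taken through the rapid systematic review

outcome_counts = {
    "some evidence of positive impact": 42,
    "no published evidence": 92,
    "causal evidence of ineffectiveness": 4,
}

# sanity check: 42 + 92 + 4 = 138
assert sum(outcome_counts.values()) == reviewed_total

for outcome, count in outcome_counts.items():
    share = count / reviewed_total
    print(f"{outcome}: {count} ({share:.0%})")
# prints roughly 30%, 67% and 3%, matching the figures quoted above
```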
The findings from this review suggest that the vast majority of the interventions lacked robust evidence of a positive causal impact on learner outcomes. They also appear to support previous studies indicating that teachers more commonly identify provision using anecdotal sources (Walker et al., 2019). These findings have important implications for the effective use of funding to improve learner outcomes, especially as schools help more vulnerable learners catch up after the Covid-19 disruption. There is now a moral imperative to enable school leaders to identify better-evidenced interventions and approaches, and researchers and school improvement professionals should work together to provide schools with more accessible summaries of research.
In setting out the case for greater systematic use of evidence, White (2019, p. 6) concluded:
‘Most interventions don’t work, most interventions aren’t evaluated and most evaluations are not used.’
Our findings indicate that there is certainly work to do in education.
This blog is based on the article ‘Assessing the range and evidence-base of interventions in a typical school cluster’ by Jane Pegram, Richard Watkins, Maggie Hoerger and John Carl Hughes, published in Review of Education.
References
Brown, C., Schildkamp, K., & Hubers, M. D. (2017). Combining the best of two worlds: A conceptual proposal for evidence-informed school improvement. Educational Research, 59(2), 154–172. https://doi.org/10.1080/00131881.2017.1304327
Gorard, S. (2014). A proposal for judging the trustworthiness of research findings. Radical Statistics, 110, 47–59.
Gorard, S., See, B. H., & Siddiqui, N. (2020). What is the evidence on the best way to get evidence into use in education? Review of Education, 8(2), 570–610. https://doi.org/10.1002/rev3.3200
Nelson, J., Mehta, P., Sharples, J., & Davey, C. (2017). Measuring teachers’ research engagement: Findings from a pilot study. Education Endowment Foundation. https://educationendowmentfoundation.org.uk/projects-and-evaluation/evaluation/eef-evaluation-reports-and-research-papers/methodological-research-and-innovations/measuring-teachers-research-engagement
Organisation for Economic Co-operation and Development [OECD]. (2017). The Welsh education reform journey: A rapid policy assessment. https://www.oecd.org/education/thewelsheducationreformjourneyarapidpolicyassessment.htm
Pegram, J., Watkins, R. C., Hoerger, M., & Hughes, J. C. (2022). Assessing the range and evidence-base of interventions in a typical school cluster. Review of Education, 10(1). https://doi.org/10.1002/rev3.3336
Slavin, R. E. (2020). How evidence-based reform will transform research and practice in education. Educational Psychologist, 55(1), 21–31. https://doi.org/10.1080/00461520.2019.1611432
Walker, M., Nelson, J., Bradshaw, S., & Brown, C. (2019). Teachers’ engagement with research: What do we know? A research briefing. Education Endowment Foundation. https://educationendowmentfoundation.org.uk/projects-and-evaluation/evaluation/eef-evaluation-reports-and-research-papers/methodological-research-and-innovations/teachers-engagement-with-research
White, H. (2019). The twenty-first century experimenting society: The four waves of the evidence revolution. Palgrave Communications, 5, 47. https://doi.org/10.1057/s41599-019-0253-6