Impact activity at the School of Education
Improving the educational outcomes of children starting school in five countries
The iPIPS project, on which this Impact Case Study is based, is a monitoring system for the first year of school that has had significant and extensive impact on pedagogical understanding, policy and practice in England, Brazil, South Africa, Russia and Lesotho. Policy and practice changes have included the introduction of assessment at the start of school in England; the creation of new instructional materials and approaches in Rio de Janeiro, Brazil; the use of data to inform policies in Kazan, Russia; and changes to pedagogic practice in Brazil, South Africa and Lesotho. Additionally, the importance of iPIPS for educational practice and policy has been noted by the Director of Education and Skills at the OECD.
Between 1st August 2013 and 31st July 2020, 42,308 children in Brazil, Lesotho, Russia and South Africa were assessed as part of the iPIPS project, and the data were used to inform policy and pedagogic practice. Additionally, advice to the Department for Education in England has led to policy-related impact in the current REF period.
iPIPS (international Performance Indicators in Primary Schools) builds upon the success of the PIPS Baseline Assessment, which Peter Tymms originated in 1994 and further developed in collaboration with Christine Merrell. The PIPS Baseline Assessment aimed to improve children’s educational outcomes by providing teachers with high-quality, pupil-level information to inform their practice, both through more targeted provision for their pupils and through feedback quantifying children’s progress over their first year at school. PIPS also aimed to provide policy-relevant information to managers and policymakers to influence system-level developments.
Contact: Prof. Peter Tymms
The Dyslexia Debate and its relevance for policy and practice
Professor Elliott’s research, involving detailed reviews in genetics, neuroscience, cognitive science, psychology, education and social policy, has demonstrated that current dyslexia assessment and diagnosis practices are scientifically problematic. Dyslexia is subject to multiple understandings and interpretations, and the proliferation of a ‘dyslexia industry’, in which so-called dyslexics are differentiated from other poor readers, has led to significant problems of equity. He has shown that the assessment of struggling readers should focus upon relevant literacy skills rather than underlying cognitive processes. Rather than diagnosis-driven provision, tailored educational interventions with research support should be employed for all struggling readers within a response-to-intervention framework, in which all struggling readers are identified, receive early intervention, and are resourced as an ongoing function of their response.
Professor Elliott’s work has received significant international media attention for more than a decade, ranging from interviews on the BBC Evening News and New Zealand Breakfast TV to Loose Women. His views have resulted in invitations to speak in many countries about how reading difficulties should be identified and catered for, leading to significant impact. In England this has occurred at both Local Authority (LA) and individual level, with practitioners (particularly educational psychologists) changing personal assessment practices, interventions, and communications with children, parents and staff. Many LAs have since published policy documents citing his research, and his close collaborators are now actively working with LAs across the country.
In the light of such work, Professor Elliott received the British Psychological Society’s 2020 Award for Outstanding Contribution to Educational Psychology.
Contact: Prof. Joe Elliott
The Pupil Premium Toolkit: Evidence for Impact in Education
The Teaching and Learning Toolkit is an evidence-based resource for schools seeking guidance on improving outcomes for learners, particularly those from disadvantaged backgrounds. It guides the work of the Education Endowment Foundation and its funding strategy for the £200 million being spent over 15 years to reduce inequalities in school outcomes in England. International use is growing, with customised online versions for Scotland, Australia, Spain (with Spanish and Catalan translations), Latin America (Spanish and Portuguese), the Middle East (Arabic) and Cameroon (French).

The Toolkit is a synthesis of research evidence from educational evaluations covering 30 different educational topics. It aims to support schools in spending their resources, especially their Pupil Premium allocation in England, more thoughtfully and more effectively. The Toolkit has directly influenced UK Government policy and spending on education, as well as the policy decisions of governments outside England. As the ‘What Works Centre for Educational Achievement’, it has also influenced the development of other evidence centres in the UK. It is now consulted by more than two thirds of headteachers in England.

The distinctive contribution of the Toolkit is that it provides estimates of the relative impact of different approaches on pupils’ attainment, using effect size as a common metric, alongside an estimate of the financial cost of adopting each approach. These findings about the potential cost/benefit of different educational approaches provide highly valued support to schools.
Contact: Prof. Steve Higgins
Defending and Improving schooling for disadvantaged pupils
Our research covered between-school segregation and developed targeted ways of assessing educational disadvantage. We used these measures to examine the impact of new kinds of schools, such as Academies, and of expanding grammar schools. Using randomised trials, we evaluated school interventions intended to improve educational outcomes for disadvantaged learners.
We published widely. One example is an open-access paper, published in 2017 just before the general election, showing no clear benefits from grammar schools; it found an immediate wide readership beyond academia, nearing 30,000 downloads. We made presentations to party conferences, to parliamentarians at Evidence Week and in the House of Commons, and to government departments, user groups, stakeholders, learned societies, teachers and school heads. We published more than 40 pieces in professional magazines such as New Scientist, and featured in more than 170 stories in major newspapers and on national and regional news.
Our evidence was routinely raised in Parliament by MPs and Lords of all parties to query policies such as expanding grammar schools, and to defend against the enforced academisation of schools. It was also used by Full Fact, POST and the NUT, and cited in two party manifestos.
The Social Mobility Commission adopted our evidence on the importance of recognising long-term disadvantage. Schools NE, among others, used the evidence to defend against unwarranted charges of school failure in the North, helping stakeholders to petition for greater resources to address the region’s higher proportion of long-term disadvantage. Our evidence helped pressure groups to argue successfully against new satellite grammar schools and enforced academisation. Our evaluations are promoted by the DfE as an evidence base to help schools support the catch-up of disadvantaged pupils. Thousands of schools cite our evidence on their websites to explain to parents why they are using particular programmes.
Contact: Prof. Stephen Gorard
Research Centre: Durham University Evidence Centre for Education