If only… The dangers of positivist bias at the Education Endowment Foundation

Never heard of the EEF? You’ve just proven my point.

Two weeks ago I went to see an invited speaker at the University of Warwick. It was a member of the Education Endowment Foundation (EEF) team, which is the single biggest funder of research on education in the UK, having been given a £125 million ‘pot of money’ in 2011, under the coalition government, to fund this research (c. £90 million spent to date). You might have expected the room to be full of lecturers and students from Warwick’s faculties of education and applied linguistics, but it wasn’t.

To my knowledge, of the 11 people who turned up, I was the only educator. The rest were primarily statisticians. And it was clear from the off that the visitor was herself a statistician, talking about RCTs* (75% of their research) and QEDs** (as an inferior option), and shrugging off the need for action research or exploratory qualitative studies (the latter only after I’d asked her about it). For all her statistical expertise, she didn’t seem to know a great deal about education, teachers, or how to engage the latter in research on the former. The very choice of the term ‘mobilisation’ to frame what they appear to presume is a top-down, applied-science approach to getting teachers to pay attention to research caused me some concern (we’re back to Schön’s (1983) ‘technical rationality’). They appear to naively presume that once they get the message across to teachers, the teachers will then change their practice. [grimace and breathe in sharply]

‘Mobilisation’? Really?

Those of us who work with in-service teachers can confirm, based on both experience and the extensive ‘change literature’, that nothing of the sort is true. Not only is it challenging to get teachers to pay attention to research, it is far more difficult to get them to make changes based on it – and for good reason. Teachers are not, cannot be, and (in my opinion) should not be automatons who simply put into practice what researchers tell them, for one of two reasons (if not both): the very real possibility that research conducted elsewhere may have limited relevance to their individual contexts of practice (the four walls of ‘my classroom’); or (perhaps more importantly) the fact that research of the RCT/QED type, generalised over large populations, tends to produce findings so generic as to be of little use even if they were applicable. Read Hattie’s (2009) super-meta-analysis and ask yourself: how much does this really tell me about what I should do in my classroom? It’s useful, but not as useful as it should be.

Research and theories based on research need to be interpreted carefully and sensitively to be of use to us as practitioners. As Dick Allwright once observed, “we have to ignore the purity of the theory to do anything useful. In a sense, it doesn’t mean to say that the theory is useless, but it does mean to say that it has to be compromised severely to become useful” (2009). Another well-known writer on teacher education, Michael Eraut, wrote (1994, 32): “idiosyncrasy and self-sufficiency are pervasive professional norms. Other teachers may have a degree of credibility but nothing is valid until one has tried it and, by implication, adapted it for oneself.” And if a final authoritative opinion were needed, Thomas Guskey, a world-leading authority on teacher change, has noted (2002, 383): “It is the experience of successful implementation that changes teachers’ attitudes and beliefs. They believe it works because they have seen it work.” Not because some bloke in a suit comes to do a ‘mobilisation workshop’.

Teaching assistants – a ‘blip’ for the EEF

Now, a quick confession: the speaker had been invited by a statistician friend of mine after one of our pub conversations about quantitative and qualitative research, among other things. At the time, we discussed an anecdote he presented, one that the visiting member of the EEF team confirmed was true when we asked her. What had happened was this: their online toolkit (it looks a bit like an online self-help happiness calculator, but actually shows teachers and school leaders the relative cost-to-impact ratio of a specific strategy or intervention) had been indicating, based on extensive, credible research, that teaching assistants (TAs) were having a negative effect on lower-achieving learners in UK schools. Understandably, the EEF started to get a lot of negative feedback on this from schools who felt their TAs were of immense value, so they decided to commission their own research on TAs (Sharples et al. 2015). Because they framed the study differently and looked at the role of TAs differently, their study found that TAs (when used effectively) are beneficial to schools! In the words of Kevan Collins, EEF chief executive, this evidence “prove[s] that when they’re used to deliver small-group interventions, they can have a great impact on pupils’ attainment.”

Now, the details of both studies (which, interestingly, don’t contradict each other beyond the bottom-line [£] summary) are of secondary importance here. What is important is that the EEF were previously, in effect, advising schools to get rid of TAs based on robust research, and now they’re not, based on equally robust research commissioned as a result of input from practitioner knowledge. All of a sudden, we notice that it ain’t the research that’s dangerous, it’s the lack of practitioner input at the EEF, combined with their attempt to simplify research findings and feed them to teachers in ways that it’s presumed they’ll find useful. Any school leader who wanted to investigate the impact of TAs would find a nuanced and complex, yet informative, literature, provided they looked at both large-n quantitative studies and more in-depth qualitative research. It’s the whole picture, and an understanding of how to interpret it, that helps, not the attempt to reduce it to ‘should I or shouldn’t I?’ (as Ortega tried to do – unsuccessfully in my opinion – in her recent IATEFL plenary).

Aside from being patronising, such essentialisation is dangerous, both for the livelihoods of TAs and, perhaps more importantly, for the students they help. I know one such TA very well, and she changes the lives of countless students with learning difficulties for the better every year. Our EEF speaker referred to this teaching assistant saga as “a blip that shouldn’t’ve happened”. Your ‘blip’ = countless lives affected for the worse.

Cost first: The EEF Teaching and Learning Toolkit

Hypotheses… or guesswork?

Perhaps the thing that concerned me most about how the EEF have chosen to use their ‘pot of money’ is a sense that they’re guessing. This stems from their conviction to support only completely deductive studies: once the decision has been made, the money is allocated, and the study comes back a few years later either with a null result (no significant effect for the intervention) or with a significant p-value and an effect size (they aim for 0.2 using Cohen’s d). So either it is, or it isn’t, and with each one, another few hundred thousand quid from their pot disappears. And according to our speaker, quite a few of these studies have returned nulls. Not wastes of money (such evidence is important), but not far off…
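(For the non-statisticians among us: Cohen’s d is the standard textbook effect size here, not anything EEF-specific – the difference between the intervention and control group means, divided by the pooled standard deviation:

d = (mean of intervention group − mean of control group) / pooled standard deviation

so d = 0.2 means the average student in the intervention group finishes about a fifth of a standard deviation ahead of the average student in the control group – a ‘small’ effect by Cohen’s own rule of thumb.)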

Of course, they have ‘expert advisors’, but I suspect these are also quantitative researchers, and listening to our EEF visitor, I got the impression (and I admit this is subjective) that they are basing their decisions about which studies to fund on hunches, guesswork and anecdotal evidence (examples were provided) – what a post-positivist would call ‘hypotheses’. And this is where conducting more mixed-methods studies and more exploratory, large-scale qualitative research would come in useful: to help reduce the guesswork and add more layers to the findings. One high-quality qualitative study could inform a large number of future experimental interventions.

At the point where this EEF representative used the metaphor ‘looking under the hood’ to mean trying to find out what happens in classrooms, I found it hard to hold back my anger… no, contempt.

Come and spend some time ‘under the hood’

Long story short: If you want teachers to engage with research and don’t want to risk frittering away a £125 million ‘pot of money’, you surely need to invest more of that fast-dwindling supply into understanding who teachers are, what they do and why they do it. And to do that, you need to work with people whose job it is to understand teachers: teacher educators, both pre-service and in-service. And you need a variety of methods and research designs to combine the qualitative with the quantitative, so that you understand both the big picture and the classroom experience.

I love research – of all types. What I dislike is people who don’t. And judging from the size of the audience that gathered on that day, it appears I’m not the only one.

STOP PRESS:

Many thanks to Mura Nava from EFL Notes for the comment below (after my initial posting) pointing me towards this other recent post by Andrew Old on much more worrying misreporting and misinterpretation of the research on the complex issue of ability grouping by the EEF: https://teachingbattleground.wordpress.com/2018/04/02/the-eef-were-even-more-wrong-about-ability-grouping-than-i-realised

*RCTs – randomised controlled trials: widely considered the gold standard for quantitative research.

**QEDs – quasi-experimental designs: less robust, but common in schools, where students cannot be randomly assigned to control and treatment groups; typically, whole classes or schools are assigned to the treatment or control condition instead.

References

Allwright, D. (2009). “Exploratory Practice.” Invited Lecture. Warwick University 7th July 2009: https://warwick.ac.uk/fac/soc/al/research/groups/llta/activities/events/past/conference09/dick_allwright/

Eraut, M. (1994). Developing Professional Knowledge and Competence. London: Falmer Press.

Guskey, T. R. (2002). “Professional Development and Teacher Change.” Teachers and Teaching 8 (3): 381–391.

Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Oxon: Routledge.

Schön, D. (1983). The Reflective Practitioner: How Professionals Think in Action. New York: Basic Books.

Sharples, J., Webster, R. & Blatchford, P. (2015). Making Best Use of Teaching Assistants: Guidance Report. London: Education Endowment Foundation: https://v1.educationendowmentfoundation.org.uk/uploads/pdf/TA_Guidance_Report_Interactive.pdf

Featured image for this blog cropped and adapted from z Q’s image ‘silence’ at: www.flickr.com/photos/sixthlie/4463280095. Licensed under Creative Commons 2.0.

4 thoughts on “If only… The dangers of positivist bias at the Education Endowment Foundation”

  1. only 11 people?!
    although there are a lot of issues with this kind of top-down initiative, we should err on the side of giving such initiatives some leeway; however, considering your example of the teaching assistants, other criticisms in areas such as ability grouping [https://teachingbattleground.wordpress.com/2018/04/02/the-eef-were-even-more-wrong-about-ability-grouping-than-i-realised/] and the EEF’s reliance on “effect sizes”, doesn’t their approach need a radical overhaul?
    ta
    mura


    1. Thanks for this, Mura. This other blog increases my concern greatly – not only because of the ‘typo’ (−0.34 rather than +0.34), but because ability grouping is a really good example of how the complexity of teaching can’t be essentialised into a single ‘should I or shouldn’t I?’ take-home message. It depends not only on the age group, context, class size and composition, but also on the lesson stage and activity, at least according to the extensive evidence presented by writers on cooperative learning (Johnson & Johnson, Slavin, Kagan, etc.).


  2. Great blog, Jason! It takes some reading to get through for the non-researcher practitioner in me – you’ve provided food for thought for when the assignment marking & trainee grading has subsided in a couple of weeks and I have time and space for some AR myself.

