Dogmatism and skepticism are both, in a sense, absolute philosophies; one is certain of knowing, the other of not knowing. What philosophy should dissipate is certainty, whether of knowledge or ignorance – Bertrand Russell
There are no references at the bottom of this blog. No bibliography. It’s just a reflection concocted after a rather average Chinese takeaway and an exceptionally good glass of white Rioja. If you’re looking for something a little bit more academic and rigorous, then you probably ought to stop reading now.
You’re still here. Brilliant. Thanks for the vote of confidence. So, what is the problem with evidence? After all, evidence is proof, confirmation, verification, substantiation, corroboration, affirmation, authentication, attestation, documentation; support for, backing for, reinforcement for, grounds for. Nothing wrong with that.
Or is there? Actually, in education, there is. Evidence based practice is all well and good if the evidence – both quantitative and qualitative – is sound, comprehensive and substantive. However, unlike other fields of science, evidence of what works in education remains patchy – the best and most comprehensive bodies of educational research and evidence have come from attempts to synthesise the myriad of small-scale research studies. To make matters worse, some of the most convincing evidence in education is qualitative, and it is promptly rejected by those who focus, shortsightedly in my view, merely on quantitative data. In education there is no equivalent of the laws of thermodynamics, nor will there ever be.
Why not? Because, in practice, there is an infinite number of variables involved in researching any aspect of teaching and learning. I have a personality. So have you. So have your pupils. I have a set of biases. So have you. So have your pupils. And their parents. All of these factors and others, ranging from class size to ability profile, from social background to time of day, ensure that what works for me might not work for you and vice versa.
So, is it the case that evidence in education cannot prove, confirm, substantiate or support teaching and learning? The answer has to be an ambiguous yes and no. Yes because it can certainly shed light on the wider practice of teaching and the larger-scale processes involved in learning. But also a cautious no. Because of the innumerable factors at play, the analysis of evidence in education lends itself to a great deal of subjectivity. Hence all the I’m right/you’re wrong arguments.
Hang on, are you saying evidence isn’t valuable? No, of course not. Evidence is really valuable. But is it always relevant? I would suggest that teachers handle the evidence thrown at them with care and that they would be well served by exercising their professional judgment and their right to doubt, which they have earned through their own practice.
I do worry that the recent proliferation of evidence-based practice proponents, whilst clearly well-intentioned, is simply reducing teaching to a set of rules to follow, a miracle cure to the disease they perceive us to be suffering from. Do it this way and you’ll be alright, they promise. However, as I hope to have established, teaching and learning are much more nuanced and sophisticated than that.
So, should we be mixing second-hand evidence into a miracle formula for teachers to consume? Or should we be encouraging and empowering teachers to turn the concept on its head and pursue practice-based research instead of research-based practice?
I also worry about the notion of teachers as consumers of evidence. I’d much rather belong to a profession that is evaluative and flexible, rather than dogmatic and intransigent about what works or doesn’t and for whom.
In my mind evidence ought to plant the seed of doubt. Evidence should always be the beginning of a journey, not its completion. If you find the concept challenging, then forget you ever read this and go back to the comfort and safety of your certainties.
Comments
Hurrah for sensible debate!
I have often questioned the purpose of academic research. Take a sample; prove it works / it doesn’t work – apply that theory to everyone else. The classic scenario is from the EEF, which states that 1-2-1 tuition is the most expensive intervention and has the least impact on student outcomes. In our own school, it’s the complete opposite: it is one of our lowest outgoings and has the greatest impact on student outcomes.
I’d highly recommend @BELMASOffice and their events and conferences. Again, full of academic theory from across the world, but @DrMeganCrawford (Chair) is doing ALL she can to bridge the gap between academic theories and what ‘actually happens’ in the classroom! More and more teachers are joining the group. I’d highly recommend it.
I really like this post José. I could easily be labelled as a convert to research. I do think it has validity, although with obvious provisos, and it is a good starting point. I hope my posts display an openness and critical eye for evidence. I think that teachers should be given the time and the support to be research evidence consumers. Ultimately, however, I think teachers need to become evidence producers, and critical ones at that. I feel duty bound to know what I am doing is working. I like your practice-based research focus and simply want more of that. It will require time and money, I’m sure.
My only concern is that as a profession we do have very little evidence to drive our decision making on interventions and CPD etc. (so much is international and in small trials, as you state) and therefore we can easily let scepticism towards evidence override everything. I have undertaken some EEF research in my department. It was invaluable in confirming our educated intuition about our daily practice on using feedback. It wasn’t imposed by some shadowy state apparatus (evidence is too often synonymous with data and our lovely Secretary of State), but was instead about us looking to see if the time we were devoting to a specific type of feedback worked. It was practice-based research in action.
I think the message you put across in your blog is important. Should we ‘question the purpose’ of academic research as Ross states? Yes – we should be critical consumers, but scepticism should not lead to dismissing all research. We have strong habits, biases and intuitions that we need to check and question regularly. The middle way between scepticism and certainty is where we will find evidence to enhance what we do.
Most importantly, we should aim to become producers of evidence beyond exam results to validate our own decisions. This does not put our professionalism into question; instead, it reinforces it. Our school is bidding to become a research-led school, where we connect research to practice in our schools so that we have the time and expertise to do it well. I won’t impose that evidence on your school, but I bet it has some value for you, at the very least in evaluating the evidence.
I have always sworn by the benefits of a good glass of white Rioja. And I must stop paying Ross in used five pound notes – he is def worth a tenner. Seriously, this is a blog which begins to tease out some of the difficulties of educational research, which those of us for whom it is the day job spend hours debating and teaching new researchers. I am heartened by the new interest in research at the minute, as seen in the many bottom-up conferences. I guess I would want people to be as informed in their research thinking as they may well be in their day-to-day classroom practice. It’s something we are working on with all our postgrad students at Cambridge.
Ladies and gents, first of all thank you for gracing this blog with your comments. In more ways than one. Not everyone, I’m afraid, is as gracious or charitable as you when it comes to responding to challenging issues or pointing out the chinks in my reasoning. Secondly, I sense a great deal of agreement between what you are saying and what I actually think (whether or not I was able to articulate that well enough is another matter). I find it really humbling but also heartening to be in such company. Lastly, thank you for reminding me of BELMAS, which I’ve been thinking of joining for a while but have not yet got round to doing it.
Interesting and balanced post. I take issue slightly with the ‘there are too many variables’ line that you take. This is a popular one for teachers and smacks of both the Nirvana fallacy and relativism.
Students all differ but we know how to get them physically fit. How do we know that? Because they’re all using the same equipment – a human body. The same goes for their minds. Yes, there are variations, but that is no reason not to use an evidence-based approach.
Think about medicine. That’s no science either and yet science can be employed by it to help. But wait….all human bodies are different so how can we give people medicine?
We do it because it’s the least worst solution.
Thank you for your comment. In the blog (I encourage you to read it again) I state that evidence “can certainly shed light on the wider practice of teaching and the larger-scale processes involved in learning.” Nowhere do I say we should ignore the evidence. The comparisons you put forward are wonderful but perhaps not relevant, as they seek to establish and then criticise a position that I do not hold.
For the sake of clarity, I’m not against the use of evidence in education (that would be daft); what I am against is the dogmatism in some sections of the education debate, fuelled by evidence which is often not appropriate or relevant to other settings.
I’ve heard people state confidently that “group work is not a valuable classroom activity” because Hattie described its effect size as low (I know Hattie finds this kind of remark both amusing and frustrating in equal measure), or leap to the claim that “direct teacher instruction” is the best – and therefore only – way to impart knowledge because Hirsch and Willingham found it to be so. Unquestioning acceptance is not very scientific, I think, or is it?
I suspect that Hattie, Hirsch and Willingham would all prefer that we took their findings, applied them in an informed manner to our own practice and settings and then reached our own conclusions. But that’s just what I suspect.
However, instead of this more scientific approach, we seem to be (I fully admit this is merely my own perception) propounding the use of evidence as the solution that needs to be ingested lock, stock and barrel. If you then decide what works and doesn’t work in your own setting, there is always a self-proclaimed pedagogical luminary ready to dismiss you as foolish, ignorant or worse… because the evidence says so.
Thanks for your reply. Perhaps we are talking past each other a bit?
I didn’t say that you were against evidence. I said “your post was balanced and well-argued”.
I just take issue with this bit:
“in practice, there is an infinite number of variables involved in researching any aspect teaching and learning. I have a personality. So have you. So have your pupils. I have a set of biases. So have you. So have your pupils. And their parents. All of these factors and others ranging from class size to ability profile, from social background to time of day, ensure that what works for me might not work for you and vice versa.”
My point was that there is an infinite number of variables in many fields. Language learning, for example. And of course the variables matter, but they shouldn’t be overstated and they shouldn’t be used to ‘give up on’ research. I’m not sure you were doing that; you were careful enough to include the word “might”. But medical research happens despite infinite variables, as does research in psychology. Language teachers often say “there are so many variables”, but actually you have a student X who is trying to learn a language Y and using their brains to do it. Those variables are the most important ones, and the effect of the other variables, for all we know, may be very insignificant.
Re-reading what you’ve written and your comments, it strikes me that we’re very close in position. We both believe evidence is important. I guess where we differ is in how important supposed variables are to the applicability of research findings?
Sorry about the typos but the first comment was written on a phone with fat fingers.
Thank you Russ for the clarification and for taking the time. I really appreciate it. Some thought-provoking considerations there. Thanks for your input.
Although even in medicine there is a recognition of complexity. Take chemotherapy, for example: it will kill some people and cure others, so doctors have to carefully match the type of chemo to the individual, and the 100K genome project is an attempt to map out much more carefully the complex differences in the ways that human beings might react to certain medicines. Just as there is no one silver bullet cure for cancer, there is no one certain way to ensure that all children will respond in the same way to your teaching. And while there may be ways to ensure that the body is fit, such a simplistic response to the complexity of the mind is impossible. As Plato said, “enforced exercise may benefit the body, but enforced learning will not stay in the mind. Therefore avoid compulsion and let your children’s lessons take the form of play.”
I’m not convinced that the post does what it says on the tin – argue the problem with evidence. It proposes a problem with dogmatism – but that is surely something quite different.
Does respect for the evidence lead to dogmatism? Not at all, I would say, because someone who argues from the evidence is also aware of other people’s ability to construct counter-arguments based on the evidence. Evidence-based theory is provisional and cautious.
The variability of human personality is not an argument against evidence either. Most of us have two legs, two eyes, one nose and brains that work in broadly similar ways. So there is at least the possibility that we might behave and learn in similar ways at least some of the time. And the variability of context and circumstance is a reason why abstract principle needs to be applied differently in different circumstances. But that is true of any branch of expertise. Bridge builders build bridges differently depending on the river they are trying to cross, the sort of traffic they anticipate, etc.
The fact that a charlatan door-to-door salesman tries to convince me that there is bulletproof evidence that his quack remedy invariably works is an argument for *more* evidence, not less.
Hi Crispin, thanks for your comment. I take your point. I suppose I started the post with one thing in mind and then one thing led to another…
I sense we’re on the same hymn sheet though. Your last paragraph chimes in very nicely with mine.
Hmmmm… Is the R word (research) welcomed by the majority of teachers in the same way that learners respond to the G word (grammar) – scary, hard, not for me, best avoided? I think the best approaches are when researchers from universities combine with front-line teachers and work together to find answers to some of the challenges they face in their schools and classrooms. I personally think we should value M-level work written by trainee teachers more – is there an audience for this work? Could it start exciting and challenging conversations and new and relevant debates? Just another thought to throw into the mix.
I do like reading your blog Jose – gracias. x
Great post Jose. Research is just one thing that a teacher’s (or school’s) decisions should be based upon. There is also your own data about your students, student/community voice and your intuition/gut-feeling/professional judgement to consider. If all of these are considered then your decision will be sound; forget one, or focus too heavily on one, and cracks can appear very quickly.
I agree Steve. Over the years I have learnt to trust my own judgment. For example, it turns out that – in areas such as the use of social media – we’ve managed to improve student outcomes despite the available evidence. I didn’t ignore the evidence, but I did question it when it didn’t fit with our experience, and developed methods to disprove it. Call me unscientific….
“I’d much rather belong to a profession that is evaluative and flexible, rather than dogmatic and intransigent about what works or doesn’t and for whom” – spot on! Research evidence introduces an important, external and (comparatively) objective basis on which to inform practice – but for all the reasons you give, it should not be applied blindly or uncritically.
I also really like Steve Mouldey’s comment “research is just one thing…” – partly because it sounds a lot like the colourful diagram I included in a paper on the topic earlier this year:
http://www.nfer.ac.uk/publications/99942/99942.pdf (see Figure 1)