Will #Brexit be a good thing for our students? A personal account post referendum

I didn’t vote. I would have, I just wasn’t entitled to. You see, I’m one of the EU immigrants 52% of the country I’ve lived in and contributed to for 22 years has just voted against. Or at least it feels that way. A few folks, clearly well-meaning, have told me that I shouldn’t worry, that I am the sort of skilled worker that this country is now free to attract (this, apparently, wasn’t possible before). “Don’t worry, you’re one of the good ones”, they tell me.

Patronising remarks such as that aside, the problem is that I am not just a skilled worker whose worth has to be measured by other, better entitled citizens. My British wife, my British children and I are quite literally the fruit of the European Union. I did not move here for economic reasons. I would have lived in Spain just as happily. I moved here because I had met and fallen in love with a beautiful, intelligent and kind English girl, who, taking advantage of the free movement of people and labour, had moved to Spain to study and work for a year. On Thursday, 52% of the electorate voted that the concept that brought my family into existence was not such a good idea after all. For the first time in our married life, the interests of my wife’s country of birth and my own country’s interests will no longer be aligned.

I must admit I was very shocked at the Brexit result. I had wagered £2 on a 10 point win for Remain in our school’s referendum sweepstake (by the way, Remain won with 85% of the vote in our internal student EU referendum). Having lived here for so long, I had convinced myself that the British natural conservatism (with a small c) and common sense would secure a Remain win. More fool me. Instead, the people to whom I had started to relate emotionally as fellow countrymen decided to cut off their noses to spite their faces. And what for?

Cartoon by Bruce Mackinnon

Daniel Hannan MEP, the Leave campaigner who desperately wanted to lose his job, has already warned that those who voted leave to cut immigration are going to be very disappointed. Nigel Farage, the former city trader and closet racist MEP for a Little Britain few of my British friends and colleagues recognise, has already admitted that telling people we could divert £350 million to the NHS every week if we left the EU was “a mistake”. And Boris Johnson, who yesterday had to rush past a gauntlet of Londoners – the people whose interests he was representing not that long ago – shouting “scum” as he fled to a waiting taxi, has already said we should not rush to exit the European Union from which he has campaigned so hard to extricate us. The EU’s reply? Leave means leave. Get on with it.

Indeed Boris appeared to be the most surprised of all Leave campaigners at their pyrrhic victory. Consensus seems to be gathering momentum among the Leave campaigners for a deal that would keep us in the single market, leading one to wonder how the notion of “taking back control” squares with having to follow EU regulations without having a say in how they are set. As things stand, we appear to have gained so much control that soon we may have a Prime Minister nobody voted for.

And, of course, there’s the British economy, which has slipped to being the 6th largest in the world, with France moving ahead into 5th position, and which lost over £200 billion in 24 hours – enough, by the way, to pay for 24 years’ worth of EU membership contributions. Yes, markets go up and down and the economy might perk up. But it might not. Was it worth the gamble? We’ll have to wait years to know the answer, and all the while 48% of the electorate and the vast majority of the people formerly known as experts will be pointing their fingers at the culprits/masterminds of this act of self-harm/liberation and lining up to say “we fucking told you so”. David Cameron’s hopes that this country would stop banging on about the EU seem naive and futile in equal measure today. Sorry to break it to you, but that isn’t going to happen anytime soon, David.

The mood was very sombre at the Telegraph Festival of Education, where I was fortunate to have spent the most momentous day in recent British history — a day in which a Prime Minister’s resignation was only the second most newsworthy event. Representative of young people across the nation, Wellington students had voted overwhelmingly to Remain in the EU in their own school mock referendum and felt betrayed by what they saw as the stupidest decision this country has ever made.

In a tableau illustrating British societal division, as I was discussing the repercussions of Brexit with fellow speakers at the Master’s Lodge, a security guard who was within earshot piped up that he had voted out and that we shouldn’t worry, because “everything was going to be better”. It transpired he had voted out to safeguard the NHS, convinced that the NHS was under strain, not because this country voted for cuts and austerity and thus got cuts and austerity, but because of the EU. I wonder whether he has now heard what Farage said about not giving money to the NHS earlier this morning. Or whether that will make a difference to his opinion.

As Project Fear turns into Project Reality before my eyes, as I contemplate my mortgage repayments going up relative to the price of my house, as energy bills and the price of my weekly shopping are set to increase over the coming months, I don’t share his optimism. I hope to God that this chap is right and I am wrong. If it turns out that Brexit is the best thing that ever happened to this country, I’ll be the first to admit how wrong I was.

In the meantime, I will continue to remind people that we had a good thing going; that Britain was already Great; that immigrants contribute enormously to this country; that native Brits were more likely to be looked after by a nurse or doctor from the EU than having to wait to be seen behind someone from the EU; that the strain on housing was due to government cuts, not the EU; that inequality is an endemic British problem and was not caused or exacerbated by the EU; and, most upsetting of all, that my children are going to be denied the opportunities their mum and dad enjoyed and took for granted.

Might new, better opportunities open up for them? They are my children. I really hope they do. I hope with all my heart that the panglossian vision that sees the United Kingdom becoming a beacon of world trade and prosperity comes to pass. I hope that the decent, hard-working folks who have blamed the EU for successive UK governments’ failures and who have bought into the loosely substantiated idea that life would be better for them outside the EU are not the first to suffer the consequences of the economic decline that started 48 hours ago.

Though I remain sceptical and angry about the blatantly deceptive claims spouted by the Leave campaign, most of which are now being hastily retracted, I wish and hope that this country, in which I have invested half my life, repays that investment handsomely, for my children’s sake. Just in case, however, I have started the process that will allow my children to acquire Spanish, and thus EU, citizenship.

As the British people have chosen to speak with their hearts and not their heads, hope is all we have now.

Overlapping arguments —Why we love to hate technology

Semantics is about committing to a shared understanding of the truth, and the way our thoughts are anchored to things and situations in the world.

Steven Pinker

One of the most common novice misconceptions about learning foreign languages is that words can be translated literally. Beginner students will commonly seek to translate idioms such as it’s raining cats and dogs word by word, causing native speakers of the target language to look up to the sky in horror before realising it’s just a peculiar figure of speech.

Even more advanced students struggle with the different semantic ranges of words across languages. For example, the dictionary tells us that the meaning of to have in Spanish is tener. Which is correct. I should know. However, the semantic ranges of these two verbs do not overlap exactly: the meaning of tener in Spanish is more restricted to possessing or owning, so in Spanish you cannot say I have a cup of tea literally. If you did and said tengo una taza de té, it wouldn’t mean that you’re having a cup of tea to drink, it would mean that you have one in your possession. You would need the verb tomar in this context: tomo una taza de té – I take a cup of tea. And the whole thing starts again as we realise that the semantic ranges of to take and tomar do not overlap exactly.

This kind of partial semantic overlap happens commonly across different languages. As fluency increases, language learners realise that meanings can stretch in some cases and compress in others and that simply knowing the first dictionary definition of a word is just not good enough for proficient usage — you also need to be aware of the overlaps, adding a layer of complexity that is quite simply invisible to the novice.
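
Programmers might picture this as sets of senses with a non-empty intersection. The sense labels below are my own invention, purely to illustrate the idea of partial overlap:

```python
# Model each verb's semantic range as a set of (invented) sense labels.
have = {"possess", "consume", "experience", "hold_event"}
tener = {"possess", "hold_physical", "age"}  # e.g. "tengo 40 años" for age

# The overlap is where a literal translation works...
overlap = have & tener  # {"possess"}

# ...and the differences are where it fails: "I have a cup of tea"
# falls under "consume", which tener does not cover.
only_have = have - tener    # senses needing a different Spanish verb
only_tener = tener - have   # senses needing a different English verb

print(sorted(overlap))  # ['possess']
```

The dictionary's first definition covers only the intersection; proficiency means knowing the two difference sets too.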

The world of science is perceived to be much more clearly demarcated. There are fewer overlaps and so research findings are often presented in binary form – something worked or didn’t, something was effective or wasn’t; something had an effect, by this much. Science is precise, which is one of the great things about it.

However, delve deeper to contextualise and interpret these findings and overlaps begin to appear. It is not uncommon for folks to appear to be disagreeing about something until they realise they are comparing apples with pears. The argument usually begins because there is an overlap in meaning – e.g. apples and pears are both fruit – but otherwise they are two entirely different kettles of fish (not to be translated literally).

Want better grades? Avoid technology

Last week a report was published that appeared to show – yes, you guessed it – that technology was not only of very little use to students, but could also cause them to receive worse grades. Crikey. The study was a randomised controlled trial (RCT) run by West Point, the famous US military academy. Since RCTs are the research gold standard in science, these findings were immediately given a great deal of importance and quickly started making headlines in the press here in the UK.

As far as I can tell, the West Point study was well conducted and it presented its findings with remarkable accuracy. There it was, in black and white and numbers with decimals: technology was no use in the classroom.

Yet, despite the findings of this study, I maintain that technology can be helpful in the classroom. Am I a science denialist? A technology zealot? I’d like to think I’m neither. As a teacher, my only interests are great teaching and learning – with or without technology. Yet because I’m not so quick to jump on to the give-technology-a-slap bandwagon (note to translators: this is a figure of speech) folks often assume I have vested interests beyond providing my students with an outstanding education. They’re wrong, of course.

But is the West Point study wrong also? No, it isn’t. The study looked into the effect of allowing students to use technology without teacher guidance, letting them do as they pleased with their devices when they were allowed to use them. Most unsurprisingly, the study found that when this was allowed to happen, technology was a distraction and more of a hindrance than a help, which in itself is a useful finding. I say finding – probably every teacher who has ever used technology in a classroom had already found this. It wasn’t exactly news to begin with. But now there is an RCT to refer to, which is great evidence, isn’t it.

To be fair to the authors, in the final paragraph of the study’s conclusion, they are at pains to highlight this:

We want to be clear that we cannot relate our results to a class where the laptop or tablet is used deliberately in classroom instruction, as these exercises may boost a student’s ability to retain the material.

And there is the semantic overlap. The apples and the pears. The study did not look at the impact of technology when it was being used deliberately by teachers to support strategies that have been found to support teaching and learning, such as retrieval practice, improving the frequency and quality of feedback, collaborative practices… which leads me to…

Why do teachers hate technology?

The answer, of course, is that they don’t. Teachers are busy pragmatists: they will adopt and embrace anything that makes their life easier. Technology makes their life easier, which is why they use it. If it didn’t, they wouldn’t, as proven by the fact that when it doesn’t, they don’t. Perhaps we need an RCT to get to the bottom of this.

Daniel Willingham’s book Why Don’t Students Like School? is a great read, full of the sort of practical advice and guidance that is most useful to teachers. But I have often wondered why he felt compelled to use that title. It baffles me because, in my experience, students generally love school: the learning, the friends… But never mind that; earlier this week, Willingham weighed in to the technology in schools debate with an opinion piece titled The false promise of tech in schools: Let’s make chagrined admission 2.0.

I immediately found myself in agreement with Willingham’s conclusion:

A solution might be to adopt new technologies alongside traditional methods; in time, researchers can evaluate whether both are needed. That’s how calculators were integrated into math classrooms.

This is indeed what we are doing at the school in which I work and what happens wherever else technology is being used effectively to support the processes involved in teaching and learning. But I did think it was a shame that Willingham used tired old arguments to reach that conclusion, e.g. Google means you don’t need to learn stuff; using technology means no more handwriting; and, of course, technology is changing our brains.

Which is all rubbish. Of course kids still need to learn stuff, despite Google. Of course handwriting is important: all students use a tablet in my school – handwriting is, if anything, thriving. And the brain-changing digital natives myth died and was buried many years ago. Yet here is Willingham repeating the moribund mantra that supporters of tech think Google means learning is obsolete. Here he is using ten-year-old myths to pass judgement on the use of technology today, even though today the use of technology in schools is far more likely to be governed by a solid understanding of sound pedagogical principles than by happy-clappy evangelists of the new.

Is Willingham wrong then? Nope. All the things he refers to have happened. There’s the overlap. But he ain’t right either, because of where his thoughts are anchored, to echo Pinker’s words. To be right, he would need to stop looking at how technology was being used ten years ago and take a good look at how it is being used today. Our doors are open. And that’s not just a figure of speech.

So there you have it. Those who say technology doesn’t work are right. And so are those who say it does.

Your thoughts, as always, are most welcome.

Does our #edtech obsession get in the way of education? – On experts, zealots and knowing better

Language shapes the way we think, and determines what we can think about.

Benjamin Lee Whorf

Education is a complex field. In it there are folks who specialise in behaviour management, assessment, professional development… And so we have assessment experts, CPD experts and even government-appointed behaviour tsars. Hardly anyone would refer to these folks pejoratively as being obsessed with their chosen field of specialism. Oh, but not educational technology. If your field of expertise happens to be educational technology, I have news for you: you’re not an expert, you are a zealot. Get used to it. If your interests lie in finding out about how digital technology can support teaching and learning, this is not a legitimate pursuit, it is an obsession and so you should find a good psychiatrist.

Hyper-puppy evangelists of the new

There are various reasons for this perception. Firstly, we need to consider that digital technology has only made it into our pockets in the last ten years or so. My youngest son is just six years old, but even he is six months older than the iPad. Mobile technology for academic purposes is almost literally still toddling its way into our classrooms. The natural conservative – with a small c – approach for many of us is to stick with what we know best, which in most cases is not digital technology. I find this very understandable.

Secondly, there is the gaping, self-inflicted wound of unreasonably high expectations borne out of the promises made during the “paradigm shift” years, when we were assured that the 21st century would “change everything”. You see, “the 21st century changes some things, but quite a lot of other things will remain the same” wouldn’t sound quite as alluring and punchy over melodramatic music. But the revolution never really happened. Like a child high on sugar, it bounced about like crazy for a while only to fall asleep suddenly in an awkward position when the fuel ran out. John Hattie, he of the effect sizes, is fond of saying “technology evangelists have been promising a revolution in education for the past thirty years. I am still waiting”. And who can blame him.

Thirdly, there are the “hyper-puppy evangelists of the new”, memorably and incisively described by Tom Sherrington: “It is all too easy to be dazzled by bright new shiny things – the latest fad or gizmo that is going to change everything” says Sherrington. “Teachers are often deeply resistant to being sold things – it happens too often; they’ve learned to be cautious. It is a giant cringe to listen to someone rave about their new idea when they appear to be all Enthusiasm and no Substance.” Amen to that, Tom.

A conundrum

But the conundrum is that, despite all of the above, technology remains helpful. That’s why we use it. All of us. For a variety of purposes. Of course it’s not always helpful, that goes without saying, but it’s helpful sometimes. To some teachers more than others. In some schools more than others.

Yet, at some point many folks have decided they’ve had enough of hyper-puppy evangelists and 21st century this that and the other, and, instead of taking a pragmatic approach as to when technology works, for whom and for what purpose, they appear to have eschewed technology altogether. The final solution.

And so we have celebrated educationalists in this country who are on record as saying that children would be better off if we “turned all the screens off”, that tablet computers only encourage children to “surf the net and look for photos of Kim Kardashian” or that they “don’t need any technology in their classroom”. In a bizarre and completely befuddling trend, it’s as if one’s expertise in education were directly proportional to how vociferous one was in repudiating technology. The less technology you use, the better teacher you are and the more you learn. Why? Because <insert pseudo intellectual nonsense and cite technological dystopia>.

The thing is that technology is nothing more and nothing less than “the application of technical knowledge for practical purposes”. This knowledge is helpful. Instead of proclaiming the virtue that apparently derives from forswearing technology – as if academic rigour and using computers were somehow antithetical – wouldn’t we be better off by remaining open to the notion that using technology, in certain circumstances, may actually contribute to improved teaching and learning? Wouldn’t it be a good idea to develop teachers’ expertise so that they are able to make discerning use of whatever technology may be most helpful at any given time for any given purpose?

So, I ask you: who is really letting their technology obsession get in the way of education? Is it the schools exploring how mobile technology can potentially support teaching and learning? Or is it those banning it outright? Is it the teachers researching and developing pragmatic strategies so that technology can be applied for practical purposes in their contexts? Or is it those who chant just-turn-it-off inside their echo chambers and deny technology’s utility altogether because, don’t you know, “there is no evidence”?

You decide.


A textbook problem: Seven suggestions to improve the quality of published resources — Exploring the cognitive principles that make a good textbook

In this article, I will explore ways in which we can make textbooks and other resources better for both teaching and learning in the light of a growing body of research in the field of cognitive science. However, I will not consider on this occasion the intrinsic value of textbooks (i.e. should we have them at all?) or the legitimate concerns arising from the influence that textbook publishers appear to have on government policy. Instead I will assume that textbooks can be a valuable resource for both teacher and student, even if only used occasionally, and therefore I will focus on providing practical suggestions for textbook design to maximise the chances of effective teaching and learning occurring when they are used.

The status quo: what is a textbook?

The textbook definition is “a book used as a standard work for the study of a particular subject”. As such, textbooks come in a variety of guises and formats, depending on the subject and age range. Most textbooks are printed on paper, which remains an excellent medium, but an increasing number of publishing houses and, notably, self-publishing practitioners are using digital media (e.g. epub, iBooks, web pages…) to deliver content and contribute to effective instruction and successful learning.

Whatever the case, in secondary education, textbooks typically…

…introduce new topics
…show suitable illustrations
…present topics in blocks
…which can encourage massed practice
…provide problems to solve
…promote independent study
…provide extra resources for regular assessment of learning

Textbooks differ in quality and some are much better than others, so not all textbooks will fit this bill. But the likelihood is that you probably recognise most or all of these characteristics in the textbooks you use. Since some approaches work better than others, it is reasonable to consider what we know about effective instruction and about how students learn best in order to improve how textbooks support teaching and learning.


What makes a great textbook?

These seven suggestions for improving the efficacy of textbooks are based on teaching and learning strategies that have been shown to improve outcomes for students.

1. Introduce new topics by referring to what the learner already knows

Many textbooks introduce new topics by making reference to learning objectives and then diving into whatever new topic the chapter introduces. Since research shows that better learning occurs when students build on prior knowledge, my first suggestion would be to start chapters with activities that require students to recall and, in a sense, activate prior knowledge, thus strengthening the connections between existing knowledge and the new concepts about to be learnt.

Activities that require retrieval of prior knowledge or that otherwise help make connections in the students’ minds between what’s already been learnt and what needs to be learnt should preface every new topic. Careful hyperlinking to previous content, multiple choice quizzes, cloze exercises or vocabulary tests are all easily embedded into digital resources to support this principle.
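
As a rough illustration of how easily such exercises can be generated digitally, here is a minimal cloze-exercise sketch in Python; the function name and the first-letter-hint gap style are my own choices, not a publisher's convention:

```python
import re

def make_cloze(text, targets):
    """Blank out each target word, keeping its first letter as a hint."""
    gapped = text
    for word in targets:
        # \b ensures only whole words are blanked, case-insensitively
        pattern = re.compile(r"\b" + re.escape(word) + r"\b", re.IGNORECASE)
        gapped = pattern.sub(word[0] + "_" * (len(word) - 1), gapped)
    return gapped

sentence = "The mitochondrion is the powerhouse of the cell."
print(make_cloze(sentence, ["mitochondrion", "cell"]))
# The m____________ is the powerhouse of the c___.
```

A publisher's toolchain would be more sophisticated, but the point stands: retrieval activities of this kind are cheap to produce at scale.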

2. Pairing graphics with text

Clearly textbooks should be aesthetically appealing. We would be wise not to ignore affective factors that could negatively influence a learner’s disposition to learn before learning has a chance to occur. Although we stereotypically assume that academic rigour correlates negatively with the number of illustrations, it is possible to produce textbooks that are both appealing and supportive of effective instruction.

My second suggestion would be to eschew superfluous illustrations, which in any case often contribute to the textbook becoming dated prematurely, and focus on pairing text with graphics that will support learning by presenting examples and depicting overarching ideas or concepts and explaining how these ideas and concepts connect. Well designed graphic illustrations depict models clearly, represent abstract concepts and reveal underlying knowledge structures that will help learners make the required connections to take learning further.

In digital resources, graphics can literally come alive, which can be very useful, though it is important to keep animations simple so that they do not become a distraction in themselves. Carefully chosen video clips can also be embedded (or linked to from a paper based resource, using, for example, a QR code) to provide examples and facilitate conceptual understanding.

3. Interleaving different but related topics and skills

Interleaving is the practice of alternating different topics and types of content. Although intuitively we feel that we learn better by focusing on one topic or skill at a time, research shows that better learning is achieved when students interleave different but related topics or skills, rather than focusing on one topic or skill, then another topic or skill, and so on.

Although studying topics in blocks creates the illusion of better learning, it is actually by interleaving topics and skills that long-term retention and greater overall understanding are achieved. This would be very counterintuitive for publishers of content, as many teachers and students might find it confusing (and therefore give negative feedback) if a chapter, instead of focusing on one topic at a time, as is the norm, alternates between related topics and skills as it seeks to connect to and build on existing knowledge.

Students and teachers may find this approach less neat and more messy, but research shows conclusively that interleaving leads to better overall learning in the long term. Once again, careful hyperlinking between related topics can support the interleaving of key topics and concepts if a digital format is being employed.
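
For digital resources, the reordering itself is trivial to automate. This sketch simply round-robins exercises across related topics; the topic lists and exercise labels are invented for illustration:

```python
from itertools import chain, zip_longest

def interleave(*topic_exercises):
    """Round-robin exercises across topics, skipping exhausted topics."""
    rounds = zip_longest(*topic_exercises)  # one exercise per topic per round
    return [ex for ex in chain.from_iterable(rounds) if ex is not None]

fractions = ["F1", "F2", "F3"]
decimals = ["D1", "D2", "D3"]
percentages = ["P1", "P2"]

# Blocked order would be: F1 F2 F3 D1 D2 D3 P1 P2
print(interleave(fractions, decimals, percentages))
# ['F1', 'D1', 'P1', 'F2', 'D2', 'P2', 'F3', 'D3']
```

The pedagogical work, of course, lies in choosing topics related enough for the alternation to build connections rather than confusion.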

4. Encourage distributed practice

Closely related to the principle of interleaving of topics and skills, distributed or spaced practice is based on the fact that learners remember information better when they are exposed to it multiple times throughout a course. Textbooks generally adopt a modular structure: study one topic, assess it, move on. Job done. Good luck for the exam.

In linear courses (such as IGCSE and the new GCSE and A level), which typically last two years, it is conceivable that a topic that is covered during the first term of the course is never returned to before a hastily arranged revision session just before study leave. Although teachers can claim that the topic has been covered — it has — they can’t claim to have covered it in a pedagogically sound manner unless they have ensured the topic has been studied more than once during the teaching of the course.

Textbook publishers can facilitate distributed practice by structuring the content so that students are exposed to key topics and concepts more than once and by building in review opportunities weeks and even months after new knowledge is acquired.
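
One simple way to build in such review opportunities is an expanding schedule of review dates. In this sketch the intervals of 1, 7, 30 and 90 days are assumptions of mine, a design choice rather than figures taken from the research:

```python
from datetime import date, timedelta

# Assumed expanding intervals, in days after first study.
REVIEW_OFFSETS = [1, 7, 30, 90]

def review_dates(first_studied):
    """Return the dates on which a topic should resurface for review."""
    return [first_studied + timedelta(days=d) for d in REVIEW_OFFSETS]

for d in review_dates(date(2016, 9, 1)):
    print(d.isoformat())
# 2016-09-02
# 2016-09-08
# 2016-10-01
# 2016-11-30
```

A textbook can achieve the same effect structurally, by revisiting a first-term topic in later chapters rather than leaving it until revision.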

5. Modelling solved problems

Modelling is a very effective classroom strategy. Textbooks too can make the most of the powerful effect of modelling by alternating problems with written-out solutions, worked examples (i.e. where the steps to achieve the correct solution are laid out) and problems that the student needs to solve independently. This is also a kind of interleaving.

This approach ensures that students become familiar, not just with the mechanics of problem solving, but also with the underlying principles required to master the topic in question. The student can then be guided to more complex but related problems or questions and, as the students become more proficient, the textbook can begin to increase the number of problems or questions for the students to solve or answer independently.

There are probably many textbooks that already take this or a similar approach occasionally, perhaps to help with particularly tricky concepts, but few structure their exercises and tasks in this way from the outset.

6. Teach independent study skills to boost metacognition

Although many textbooks promote independent learning by, for example, pointing students to additional sources of reading, relevant websites, video clips, films or TV programmes, few actively seek to teach specific metacognitive strategies to help students become better learners in a particular subject. The view could easily be taken that, say, a French textbook’s purpose is to teach students French, not to teach students how to learn, which is the essence of metacognition in this context.

This view would seem entirely justifiable until one considers the important contribution that metacognitive strategies make to successful learning. For example, research suggests that teaching learners to plan, monitor and evaluate their own learning by providing subject-specific strategies and guidance has great impact on learning. Textbooks could interleave activities in which students are asked to identify where a task might go wrong; to lay out the steps required to achieve mastery of a topic; to produce their own worked examples; or to formulate appropriate questions and provide possible answers.

7. Frequent assessments for better retention

My final suggestion deals with assessment and how it is generally used to determine the extent to which a student has learnt the required material. In another counterintuitive turn, it turns out that frequent assessment is more helpful for learning than for assessing – that is, for determining the extent of learning.

Many textbooks already come with supplementary assessment resources, usually in a separate pack which sometimes needs to be purchased on its own. In more than a few cases, assessment is clearly an afterthought. These assessments typically come in the form of high-stakes end-of-unit or end-of-module tests and end-of-year exams.

Given the unequivocal nature of the research that suggests that frequent retrieval practice boosts retention, my suggestion would be for textbooks to encourage frequent retrieval practice by design through low stakes or no stakes testing and quizzing, whereby testing and quizzing are a part of the learning process, not just the assessing.

The implications of this for digital resources are enormous. There are many software packages and digital publishing tools (e.g. Apple’s iBooks Author) that facilitate the inclusion of frequent retrieval practice opportunities. Even if the textbook is primarily paper-based, publishers could consider linking to dedicated web pages where learners can self-test and determine for themselves where they are in their learning and how to improve.
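
A self-marking, low-stakes quiz of the kind that could sit on such a companion page can be sketched in a few lines; the question bank here is invented for illustration:

```python
# A tiny invented question bank; answers are checked case-insensitively.
BANK = [
    ("Capital of Spain?", "madrid"),
    ("7 x 8?", "56"),
    ("Past tense of 'go'?", "went"),
]

def run_quiz(bank, answers):
    """Score answers against the bank; low stakes, so no grade is recorded."""
    score = sum(
        1 for (_, correct), given in zip(bank, answers)
        if given.strip().lower() == correct
    )
    return score, len(bank)

score, total = run_quiz(BANK, ["Madrid", "54", "went"])
print(f"{score}/{total}")  # 2/3
```

Because nothing is recorded, the testing here serves the learning, not the grading, which is precisely the point of low-stakes retrieval practice.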

This is obviously not an exhaustive list and I will have missed out some important suggestions. There may also be overarching factors, such as cost, of which I am unaware and which may cause publishers to pursue one course of action over another. Clearly, although I am an experienced teacher, these are the suggestions of someone with little publishing experience and should be taken as such.

This Twitter thread contains further suggestions to publishers from teachers. If you have any further considerations to contribute or have any comments about the points I have made, please do not hesitate to add to the conversation, below.

Sources and further reading

Brown, Peter C., Henry L. Roediger, and Mark A. McDaniel. Make It Stick: The Science of Successful Learning.

Deans for Impact. The Science of Learning.

Education Endowment Foundation. Meta-cognition and Self-regulation.

Pomerance, L., Julie Greenberg, and Kate Walsh. Learning about learning.

Is Google teaching us anything? — Exploring the concept of "knowledge nihilism"

In his book The Shallows, Nicholas Carr suggests that the internet is making us dumber. Carr finds that the vast amount of hyperlinked information available on the internet means that depth of knowledge has given way to shallowness. Casually disregarding the internet’s arguably most significant feature, Carr asserts “people who read linear text comprehend more, remember more, and learn more than those who read text peppered with links”. To Carr, the internet is a distraction from proper learning.

At the other extreme of this continuum, we have Sugata Mitra, to whom the internet conjures up revolutionary new ways to acquire knowledge. Controversially for many, Mitra claims that “people are adamant learning is not just looking at a Google page. But it is. Learning is looking at Google pages. What is wrong with that?” To Mitra, the internet is learning’s best hope.

Even though I personally disagree with Carr’s conclusions, I can recognise that there is some truth in his warning that the internet has changed the way we access and process information, just as there is some truth in Mitra’s rather uncontroversial claim that, given the right circumstances, children can learn by themselves. Neither Carr nor Mitra has the answer. They each have an answer.

But the truth is that both Carr’s often myopic evaluation of the utility of the internet and Mitra’s hopelessly Panglossian vision of learning in the 21st century are easily challenged, which makes them easy targets. Read this thoughtful critique of The Shallows by Jonah Lehrer in the New York Times or this denunciation of Mitra’s work by EdTech entrepreneur Donald Clark.

Knowledge nihilism

In an excellent article Benjamin Riley, founder of Deans for Impact, explores the practical implications of scientific principles which we know contribute to improved teaching and learning. Towards the end of the article Riley takes a turn and cautions against “the growing danger posed by what I call knowledge nihilism”. Riley continues “the proponents of knowledge nihilism believe knowledge itself is overrated. In an era of proliferating technology that lets us access information at speeds unimaginable even a few years ago, they believe students no longer need to know facts or understand procedures. After all, why teach it when they can Google it?”

This caused me to pause for thought. Not about the practical application of scientific principles with the aim of improving pedagogy, but about the knowledge nihilists against whom we teachers were “the last line of defence”. In a brief Twitter conversation I suggested to Riley that perhaps he was overplaying the influence of knowledge nihilism and that his arguments were strong enough to stand on their own merit, without recourse to a defence from a largely imaginary adversary. Riley did not buy it and claimed that “the notion you need not learn googlable facts is widespread” and, unsurprisingly, Sugata Mitra was wheeled out as the epitome of knowledge nihilism, which is fair in a sense, given Mitra’s famous refrain “knowledge is obsolete”.

But how fair is it to suggest that the notion that Google negates knowledge acquisition is widespread? As a teacher of 13 years’ experience in various schools, having worked alongside hundreds of teachers in both the state and the private sector, I have never come across a school or teacher for whom knowledge acquisition wasn’t the top priority.

Perhaps Riley is right and this belief is indeed widespread, as he claimed. Perhaps circumstances have sheltered me from this reality, so I decided to poll Twitter on this question:

At the time of writing, over 1,000 people had answered the poll and 89% of respondents (this may change as more people take it) had agreed that it is indeed still necessary to acquire knowledge and learn facts at school, notwithstanding the existence of Google. You may criticise the poll as unscientific (it is!), the question as ambiguous, or the choice presented as a false dichotomy, since respondents were given a binary choice with nothing in between. Nevertheless, assuming that all respondents are indeed teachers, the results suggest that the claim that knowledge nihilism is widespread is an exaggeration, at least among teachers.

Knowledge is absolute

The response to the poll was quite entertaining. What does he mean? I can’t believe he is asking that! What is his agenda? What kind of rubbish question is that? And quite understandably.

Tom Bennett, director of researchED and chair of the UK government behaviour group, was appalled that a relatively small number of respondents had answered “no, it isn’t necessary”. Perhaps a tad dramatically, Bennett told me “terrifyingly 13% (the poll results stated 13% at the time) actually do support teaching no facts. Think about that. That’s like doctors not healing”.

Others entertained a discussion about the role of Googling things. Carl Hendrick, Head of Learning and Research at Wellington College, suggested that “Google can be useless. Giving someone a German dictionary doesn’t mean they can speak German”. Nick Dennis, Deputy Head at Nottingham High School, was more optimistic about the role of technology in general and asked “how many have gone beyond what they knew because of the tech and organised events to discuss ideas as a result? How many academic papers have been shared as a result?”.

And finally there were those who, rightly, queried the polarised nature of the question. Yana Weinstein, co-founder of the very excellent Learning Scientists, remarked “I think it’s a continuum. Maybe [some] think knowledge is less important to acquire”. Weinstein is spot on. This is where the real debate is. Or at least where it ought to be.

Thankfully, I think, relatively few educators subscribe to the knowledge-is-obsolete-because-Google school of thought in its more extreme interpretation. The interesting thing to debate is not whether the internet renders learning stuff unnecessary, but how the internet impacts on learning. To take Hendrick’s dictionary analogy further, giving someone a German dictionary does not mean that they no longer need to learn German, but it does mean they can better learn German.

Does Google have value?

It seems to me that education is very much like owning a smartphone: you don’t realise you need one until you get one. But I’m not going to enter any controversy about what the purpose of education ought to be. This is not a question that science can answer. To answer that question we must interrogate our values. Although we may discuss and debate the finer points of how knowledge is best acquired, few of us would deny its value. As we discuss and debate the role of technology in education, I would like to think that we can move on to a debate in which technology is not presented as the opposite of learning. In my view, it is reasonable to find learning how to use technology to support further learning valuable, not because we don’t need to acquire knowledge anymore, but because we still do.

There are few true dichotomies in education. Most issues are debatable, equivocal and happily settle on the Aristotelian golden mean. But here there may be one true dichotomy: the internet either helps you learn or it doesn’t. Make your choice and teach children accordingly.

What #Postergate tells us about dignity and honesty – Why we're never more than one disagreement away from howling in the trees

Seldom, very seldom, does complete truth belong to any human disclosure; seldom can it happen that something is not a little disguised or a little mistaken.

Jane Austen, Emma

Research supports the tacit teacher knowledge that a great lesson starts with an initial review of prior knowledge. Recently I had the privilege of observing an excellent Year 7 Geography lesson in which the teacher opted to start by handing out A3 sheets and asking the students, in groups, to fill in the different sections pre-printed on the sheets.

The task took approximately 10 minutes (our lessons are one hour five minutes) and watching those 11-year-old students recall their knowledge of the topic as they discussed what they had learnt in previous lessons would have left anyone in no doubt about what these children knew and about the effectiveness of this task as a starter activity.

On another occasion, I observed a Year 12 Psychology lesson, in which the students used a brochure – complete with relevant illustrations and diagrams – that each of them had created in advance (I believe it had been a homework task) about the salient aspects of the topic they had been covering. This was used very effectively to support classroom discussion, which was expertly guided by the teacher, who teased out explanations, counter explanations and generally caused students to think really hard. It also became clear that the brochure was being used by students as a revision aid outside of lessons.

On both occasions, the product was, to all intents and purposes, a poster. But not everyone likes posters, and that’s fine.

A few days ago the Evening Standard preambled a piece about Tom Bennett thus:

Schoolchildren who spend lessons watching DVDs, designing posters and doing “group work” are not being taught properly, the Government’s behaviour czar has warned.

Tom Bennett, the Department for Education’s discipline expert, said some teachers fill lessons with pointless activities that keep children busy but do not constitute proper teaching.

The piece itself was typical tabloid clickbait, so it was more concerned with attention-grabbing headlines and counting page views than content, but it drew heavily from a more measured TES piece:

In my career I’ve done plenty of [DVDs], I’ve seen many more of them, and it took me years to see through the candy floss and the tinsel. Some things just aren’t teaching; they’re activities that, yes, generate heat and light, but offer no warmth or illumination.

Posters often fall into this category. “Do a poster,” says the teacher. And an hour trickles away and bubble writing happens, and someone back shadows the title, and four sentences are written while someone else cuts out pictures.

What constitutes proper teaching for Bennett is clear to all (clue: it isn’t posters or group work). But at least he has the decency to cover his modesty with the fig leaf of qualifiers such as some and often, leaving himself just enough wriggle room, in typical Bennett fashion, to claim he is being misinterpreted if anyone takes issue with any of his pronouncements.

But none of this bothers me. Everyone is entitled to an opinion, and, to be honest, Bennett’s is more informed than most – though as biased by his own experience, beliefs and values as everyone else’s. From an anthropological perspective, Bennett might not be far from the monkey howling loudly from the tree tops to assert his dominance, with a chorus of subservient monkeys howling in allegiance. Meanwhile, in a nearby tree, another troupe of monkeys begins to howl in opposition.

The indignation

What does bother me – and here I risk straddling the saddle of my own high horse – is how we tend to react to this. On the one hand, I understand some of the reaction: Bennett doesn’t just represent his own opinions anymore. He is perceived as a government representative – the government’s behaviour expert – so his opinion is seen as having far greater reach and more implications than if he were just a random blogger, as he once was. As am I. This can cause alarm, worry and indignation among those who regularly orchestrate successful learning using the activities he often condemns so flamboyantly.

On the other hand, although many folks managed to express their disagreement gracefully and thoughtfully, sadly not all of the reaction he received was friendly criticism, as I hope this is. Much of it was vile, personal and, frankly, disgusting. From teachers. To other teachers. There is something deeply wrong with us if this is how we confront those with different opinions, however wrong we deem them to be.

The indignation about the indignation

But then there was the equal and opposite reaction. There were the revenge personal attacks. There were those who saw criticism of Bennett’s views as evidence of the progressive illuminati that have so damaged education in this country. There were the calls to common sense and honesty. “It seems it is impossible to discuss bad practice without causing offence” someone tweeted.

A bad workman always blames his tools.

Anon.

But let’s be honest. If Bennett had any real interest in discussing bad practice and activities that generate no “warmth or illumination”, he would have also tackled the often woeful attempts at direct instruction, with explanations so unclear they explain nothing and lead nowhere; or the hours spent on pointless worksheets that help no-one and whose only purpose is to fill lesson time. But he didn’t. Instead Bennett sought to associate certain activities with poor practice deliberately, clearly forgetting that everything can be done badly.

As a measure of the hypocrisy of which we are all guilty, take the Year 7 posters and the Year 12 brochures from the opening paragraphs and don’t call them posters or brochures, call them knowledge organisers instead. Honesty and dignity, it appears, are the first victims in this war of attrition between folks who, frankly, should know better. It never ceases to amaze me that, despite the veneer of art, science, education and civilisation, it often seems we’re never more than one disagreement away from howling in the trees.

So, instead of tackling the issues, we seek to howl louder. We bare our teeth. We resort to name-calling and ridicule. We find labels for each other. The metonymic substitution of the complexity of a person with a single characteristic is the kind of linguistic shift that historically precedes genocidal mania – one is not a person but a Jew, a Muslim or a foreigner; one is not a teacher, but a progressive or a neo-trad. This is the divisive language of hatred, and it’s not for me, thanks.

The problem with #edtech debates —Technology isn't always the solution, but isn't the problem either

Man has turned his back on silence. Day after day he invents machines and devices that increase noise and distract humanity from the essence of life, contemplation, meditation.

Pseudo intellectual guff

Listen to his royal highness Tom Bennett bad-mouthing iPads in his likeable Scottish lilt. Hear the knowledgeable David Didau proclaiming confidently that the only tech we could ever possibly need in our classrooms is – at a push – a visualiser. Read the always thought-provoking Martin Robinson inveigling you eloquently into turning off everything that can be turned off. Or follow the numerous trendy twittering trads who profess 140 characters at a time that great teaching and effective use of technology in teaching are somehow mutually incompatible.

It’s easy to conclude, isn’t it, that it has become almost faddish to espouse anti tech sentiments of late. As if the more vociferously anti tech one is, the greater expertise in education one is seemingly able to demonstrate.

The debate (from Latin dis- and battuere, “to beat apart”) around the use of technology in education is often reduced to binary nonsense. One’s either a happy-clappy, app-smashing, iPad-wielding technology zealot or a technology-eschewing, chalk-wielding, iPad-smashing Luddite. If, like me, you can see the opportunities as well as the challenges and are in neither extreme, this debate is probably boring you to tears and has been for some time.

Nevertheless, think what you may, this debate is worth having. You see, to suggest that technology changes everything is just as daft as to suggest that it changes nothing. So it is only by having this debate that we expose the unreasonableness of dogma and the foolishness of positioning oneself in the extreme.

But let’s make it an informed, intelligent debate, devoid of pseudo intellectual guff whenever possible. Yes, children still need to learn stuff, despite Google. No, the internet is not making us stupid. Yes, digital natives are a myth. No, social media does not cause us to be ‘alone together’. Yes, we do use a lot of technology. No, relying on technology is not an ‘addiction’. Yes, great teaching doesn’t require technology. No, great teachers don’t eschew it. Yes, interactive whiteboards are expensive and often under-utilised. Yet I would like to keep mine, thanks. Yes, iPads can be a distraction. But no, they don’t have to be.

Technology isn’t always the solution, but isn’t the problem either. Let’s have an informed debate. Over to you.

Tablets in Schools Case Study in Success

This is a transcript of the seminar I delivered at BETT in the School Leaders’ Summit on Friday 22 January 2016. “Case Study in Success” was the title chosen by the organisers.

My name is José Picardo. I am assistant principal at Surbiton High School (SHS). Our school is currently in the third year of a 1:1 programme after we decided to give all our students a tablet. Mobile device territory is a bit of a minefield. Depending on who you speak to, we are either incredibly innovative or veritably foolish: some people think it is a great initiative, many disagree altogether with the notion of mobile devices in the classroom, while some others disagree that tablets are the right mobile device for the classroom — for some folk mobile devices just have to have a keyboard. So I never know if I’m going to get a pat on the back or booing and hissing.

One of the things I want to do today is talk a little about our journey at SHS and about what has happened in the last three years. But the most important aspect of my talk will be to show you what a 1:1 tablet environment actually looks like.

So later on I will be showing you a three-minute video filmed by one of my colleagues, with actual lesson footage, painting an accurate picture of what a 1:1 classroom looks like. We find people we speak to don’t really know what students actually do on their devices. On the one hand, many assume students are going to be on social media and distracted from learning, not learning anything. This is clearly not very rigorous. On the other hand you have people who assume that when tablets are introduced, students will use them exclusively. I would like to think that common sense would tell us all that neither of these things actually happens and that there is a very productive Goldilocks zone in which teaching and learning can thrive, develop and improve in quality.

How many of you have spotted the guy standing in front of the bookshelf? (cover picture, at the top). There is a guy standing there. His name is Liu Bolin. He’s a Chinese performance artist who gets his assistants to paint on him, so that when he stands in exactly the right place, you can’t see him. He blends into the background.

Why am I showing you this picture? The reason is that it encapsulates very neatly our vision for the use of technology at SHS: we want technology to be invisible. Just like electricity. Like running water. We don’t want technology to be a big deal. One of the things that people notice when they visit our school is just how ordinary and normal having tablets is. They’re just there for when they’re needed. They’re used in many lessons, but not in all lessons. When a tablet is used, it’s not used throughout the lesson, but rather when appropriate and when it adds value to the learning. It’s the most normal thing in the world. Tablets are invisible. They’re used when required. No fuss. And that’s exactly what we set out to achieve. We wanted teachers to rely on tablets in the same way they rely on electricity: they walk into a classroom, flick the switch and expect it to work. Nobody plans a lesson in case the electricity doesn’t work! So, we set out to make tablet technology as reliable as electricity or running water. We wanted it to be there on tap when needed.

Technology ceases to be considered as such when it becomes normal, when it becomes mundane. Cars were technology once — they were gadgets, if you think about it. But now everyone relies on them, they’re the most normal thing in the world. So, I really think that the debate that we are currently having about whether or not to have mobile devices in schools will soon be obsolete, as it slowly becomes apparent to more and more of us that the benefits of teaching and learning in environments where teachers and children are able to use mobile devices outweigh any negative aspects. Even what we currently see as big obstacles, such as the financial cost of 1:1 programmes on this scale, will eventually cease to be impediments, as the cost of using mobile devices is outweighed by the cost of not using them. Remember that cost can be measured in terms other than financial.

Where is the evidence?

One of the first questions that people often ask when they hear we have given all our students tablets is “Where is the evidence? Why are you spending so much money on tablets?” The suggestion being that there is no evidence to support the huge financial investment required to put mobile devices in the hands of every teacher and student.

This is wrong. There is evidence. It’s just not very compelling. To be fair though, we must remember that the iPad — the device which heralded the age of the tablet — is only 6 years old. In fact, my six year old son is slightly older than the iPad. So there are no longitudinal studies about the impact of tablets in schools. But there are studies that have looked at the impact of digital technology in a more general sense that have been brought together into meta-studies, such as this summary by the Education Endowment Foundation.

Very often, the people who ask this question — which, by the way, is a very good question to ask, I’m certainly not dismissing it — point out that technology has been shown to have only a moderate impact on outcomes, as measured by exam results.

The Education Endowment Foundation’s Toolkit has worked out from analysis of metastudies how impactful particular interventions are as measured by months’ gain (and yes, I am aware this kite-marking method of measuring effectiveness and impact is contested and has its flaws).

In this model of measurement, digital technology interventions add four months’ gain. Pretty pedestrian stuff. There are other interventions that add a lot more. For example, good feedback interventions have been shown to add eight months. That, in effect, puts children nearly a whole academic year ahead in that subject compared to children who are not being taught in the same way. Pretty compelling stuff.

Meta-cognition and self-regulation strategies have also been shown to add up to eight months. Then comes peer tutoring — getting older students to teach or mentor younger students — which also has great impact, at six months. Homework interventions can add five months in secondary, though according to this research they add no gain in primary. Collaborative learning also adds five months, as does mastery learning.

Finally we get to digital technology interventions, which, as I pointed out earlier, add four months, far from the top of this list. To make things worse, statisticians tell us it is probably wise to assume an error margin of plus or minus three months. That is to say, interventions whose gain falls anywhere between plus three and minus three months are probably neither here nor there, and their impact is probably too close to neutral to confer any actual advantage.

Digital technology is only one month away from this error margin. In the minds of many, this raises the question: if digital technology interventions cost so much but their impact is questionably low, is it worth spending all that money on computers, laptops, infrastructure, tablets and what have you?
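The filtering logic described above can be sketched in a few lines. The months’-gain figures are the ones quoted in this talk; the ±3-month error margin is the rule of thumb just mentioned, applied mechanically here for illustration rather than as a statement about the EEF’s own methodology.

```python
# Months' gain per intervention, as quoted in the talk.
GAINS = {
    "feedback": 8,
    "meta-cognition and self-regulation": 8,
    "peer tutoring": 6,
    "homework (secondary)": 5,
    "collaborative learning": 5,
    "mastery learning": 5,
    "digital technology": 4,
}

ERROR_MARGIN = 3  # assumed +/- 3 months either way

def clearly_beneficial(gains, margin=ERROR_MARGIN):
    """Keep only interventions whose gain sits above the error band."""
    return {name: g for name, g in gains.items() if g > margin}

surviving = clearly_beneficial(GAINS)
# Digital technology, at 4 months, clears the band by just one month —
# the weakest of the interventions that survive the cut.
```

Run this way, every intervention listed survives, but digital technology only just: exactly the point the paragraph above makes.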

But we then realised that many of the studies upon which the EEF and others, such as John Hattie, had relied to work out the impact of digital technology went back decades, to when digital technology meant taking children out of ordinary classrooms and placing them in front of monitors in computer rooms. That was the intervention. In many of these studies, the objective was to measure how good computers were at teaching instead of teachers, not how technology supported the processes involved in great, successful teaching and learning. This is important. This is where the four months’ gain figure comes from.

So we thought this might be missing the point. We thought: what are computers really good at doing? Everyone carries a computer of some kind on their person these days: a laptop, a tablet, a smartphone. They wouldn’t do that if they weren’t useful. These mobile computers, among other things, are really good at connecting people. Teachers and students are people, so, could we, for example, use these mobile devices to amplify the effect of those feedback interventions? At this point it’s important to point out that the focus of these interventions would not be on the technology, but rather on training teachers to give the excellent-quality feedback that is most likely to move students’ learning on.

In terms of meta-cognition, if all students had access to mobile devices, they could potentially have access to a greater amount of knowledge as curated by their teachers in digital learning spaces, as well as the information contained on the web. They could help themselves to learning. If you consider the fact that world-class office software (word processors, presentation tools, email clients, calendars…) is freely available to all in education thanks to Apple’s iCloud, Google and Microsoft, self-regulation too could feasibly be supported by the use of mobile devices. In fact, at our school, we often see students scheduling appointments in each other’s calendars to attend clubs or rehearsals, or even FaceTiming each other for help and support in the evening and at weekends if they are struggling with a particular piece of homework.

When it comes to homework, apps such as Showbie, which we use across the school as our preferred method of managing workflow, marking and feedback when digital tools are used (we still use paper textbooks and exercise books too!), help enormously to improve both the quality and the range of options for homework tasks, as teachers can now set tasks involving video and audio recording and editing. As a teacher of languages, this means I can now set listening and speaking tasks, which have traditionally been assessed exclusively in the classroom, for homework. It also means that, using voice notes within Showbie, I can give really accurate, immediate feedback to specific students on how to pronounce correctly the words they find difficult, rather than dedicating valuable lesson time to doing so.

So, this is how we interpreted the evidence. We didn’t say, ok, the evidence for digital technology interventions is not very compelling, so we won’t use any technology, as some folk appear to suggest. We said, hang on, what does the evidence say about how teaching and learning are most successful and how can digital technology support these processes and interventions?

If you’ve been studying the interventions on this last slide, above, you may have noticed I have not yet mentioned reducing class sizes, in the bottom right hand corner. According to this research, reducing class sizes hardly has any impact beyond reducing the number of exercise books a teacher has to mark. It appears to have very little impact on the learning.

The reason why I bring this up is because one of the original suggestions folk made as to how to better spend the money we were about to spend on tablets was to employ more teachers. We actually looked into this and it turned out that if we spent that money on teachers instead of tablets we could probably employ an extra three or four teachers. In a school of our size, this would mean an average class-size reduction of under two students. On balance, therefore, we decided that the money would be better spent on supporting higher impact interventions through the appropriate and effective use of tablet technology. It seemed reasonable.
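The back-of-the-envelope arithmetic behind that decision can be sketched as follows. All the specific numbers here are hypothetical stand-ins: the talk gives only the outcome (three or four extra teachers, an average reduction of under two students per class), not our actual roll or class counts.

```python
# Hypothetical figures illustrating the class-size calculation:
# a school roll and class count chosen only to reproduce the shape
# of the trade-off described in the talk.
def average_class_size(num_students, num_classes):
    return num_students / num_classes

students = 1200        # assumed school roll
classes_now = 48       # assumed number of concurrent classes
extra_teachers = 4     # the scenario considered in the talk

classes_after = classes_now + extra_teachers
reduction = (average_class_size(students, classes_now)
             - average_class_size(students, classes_after))
# With these assumed figures, the reduction comes out at under
# two students per class — a marginal gain for the same outlay.
```

Whatever the exact figures, the shape of the result is the same: spreading that budget across a handful of extra teachers shaves only a student or two off each class, which is why the money went on the higher-impact interventions instead.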

Please don’t misinterpret me and assume that I think teachers aren’t important. I think that great teachers and great teaching are the key factor in successful learning and technology is the servant to great teaching and learning, not vice versa. I don’t think that technology will ever substitute teachers in a formal school environment but I do think that technology can amplify the reach and impact of great teachers.

The bigger picture

With all this in mind, what I then try to do is to get people to look at the bigger picture. This is a photograph (above) of one of my students in a Spanish lesson. She is using her tablet to test her vocabulary with self-marking activities and language games. When looked at in isolation, it is easy to counter that she could be doing that task in an exercise book, which is much cheaper, and that there is no need for an expensive tablet. They would probably be right. She also uses her iPad to access her textbook, which is online and contains grammar exercises and all the listening activities ancillary to language teaching and learning. You could say then that buying the paper textbook is cheaper than buying a tablet. Again, this is correct, textbooks are indeed cheaper than tablets. She also uses her tablet as a planner to self-organise. But she could use a paper planner for this. Linked to this are her classwork and homework tasks, which I set, collect, mark and give feedback on using Showbie, the tool I mentioned earlier, when appropriate. This saves a great deal on printing costs, as worksheets can be delivered and completed digitally. But, once again, in isolation, the cost savings on printing fewer sheets do not justify the cost of the tablet.

Using mobile devices and the internet means that intellectually engaging low-stakes tests are available to students on tap, both in the classroom and beyond; that textbooks come alive with multimedia content supporting learning when appropriate (e.g. in language learning); that homework can be set, marked and feedback delivered instantly from teacher tablet to student tablet without exchanging bits of paper so often, speeding up the learning process as a result; and that planning and self-organising are more effective, as the tablet operating system integrates with Outlook, digital planners and other tools that ensure tasks are planned and completed in a timely manner, helping students better keep up with the work they need to do using the same tools the rest of the world uses. This is not about preparing students for the future. It’s about preparing students for the present.

This is why it is important to look at the bigger picture by increasing the granularity and seeing all of these mobile device affordances in context. We have found at our school — examples of this above — that, although tablets are not used exclusively (something we are very happy with, by the way, for us it’s never been either/or), they have been integrated almost seamlessly into the processes involved in teaching and learning.


We find our students use tablets to read for pleasure at break; to write English essays in the gardens; to research History in the library; to film a movie in the target language in a French lesson; to access teacher-curated resources on our digital learning spaces; to communicate with each other after school for support; to record data and experiments in the science labs; to take part in whole-class quizzes which deliver detailed diagnostic feedback immediately; we find teachers and students using them as visualisers and projecting the images wirelessly to our interactive whiteboards… the list is almost endless.

Painting an accurate picture

At this point, I’d like to illustrate all of the above with a short, three-minute video which depicts, using actual lesson footage, how tablets are used at our school.

As you can see, students move seamlessly from one medium to another. From paper to tablet and from tablet to paper. Also bear in mind this video recorded tablet use only, so please don’t go away thinking that tablets are used all the time. Teachers and students use them when they add value. That’s it.

Having listened to me today apparently harping on about how technology can support existing practices, it may be tempting to ask: if mobile devices are not visibly transforming the way we teach and learn, then what is the point? After all, in lessons, as this video shows, teachers are very often still the sage on the stage and students still often sit in rows.

But the way teachers can prepare lessons and students access learning is changing apace. This is not the revolution promised by many, but a more desirable evolution. Might technology one day transform education beyond recognition? Who knows? I always say that if you want to guarantee to be wrong, then go ahead and predict the future.

However, what we can say is that technology, when used effectively, can demonstrably support teaching and learning, and that this more effective use does not arise from investment in kit and hardware alone, but also from investing in making teaching and learning better, using the best tools for the job, digital or analogue.

It’s only when looked at from this wider perspective, and in context, that the value of teaching and learning with mobile devices such as tablets becomes apparent.

Thank you.


Originally published on Medium.

Magic mirror on the wall, who is the fairest of them all?

Magic Mirror: What wouldst thou know, my Queen?

Queen: Magic Mirror on the wall, who is the fairest one of all?

Magic Mirror: Famed is thy beauty, Majesty. But hold, a lovely maid I see. Rags cannot hide her gentle grace. Alas, she is more fair than thee.

Snow White (1937)

In Snow White, the wicked Queen is determined to be the most beautiful in the land. The Queen takes this obsession to the extreme and orders her huntsman to take Snow White to the woods, kill her and bring back her heart as proof – or, as in the more gruesome original tale from the Brothers Grimm, her lungs and liver. But, despite all the Queen’s power and terrifying witchcraft, the Magic Mirror tells it as it is.

Earlier today I was reminded of this video, below, which originally made the headlines in the Washington Post back in 2014. The video depicts a professional development session in Chicago, USA as recorded – presumably surreptitiously – by one of the participants. Now, I’ve attended some pretty dire professional development sessions in my time, but, if this video portrays accurately what was going on in Chicago, it takes the biscuit by a country mile. Take a look for yourself:

The worst case scenario is that this is indeed how the professional event unfolded. The best case scenario is that these teachers are simply practising some of the new strategies they have learnt during this professional development event. The sort of strategies that when put to use in schools might make lessons look like this:

Now, it may be that you see nothing wrong in either of these two videos. It may be that chanting in this way is normal practice for you and that you think that, despite the growing doubts about the effectiveness of schools which rely on this kind of approach, it is a worthwhile activity in your setting. I won’t argue with you. If it works for you, who am I to say?

However – and I am glad you asked – personally, I suspect this kind of strategy probably falls under the category of what Dylan Wiliam calls “teaching that looks impressive” but actually has little impact on outcomes. Folks who use these techniques regularly may just be kidding themselves that children are learning better because they are doing exactly as instructed, in perfect choreography. Time will tell, I suppose.

However, what struck me the most was the unanimously negative reaction the Chicago professional development video received from teachers who, all over the globe, immediately decried the lack of professionalism, the infantilisation of teachers and the appallingly poor training to which these teachers were subjected. Teachers should not be demeaned like this. This is clearly why teachers are “going out of their minds”. No wonder we have teacher recruitment and retention problems, concluded the horrified reviewers.

Yet groups of schools in the US, such as KIPP or Uncommon Schools, populate the internet with videos in which, presumably routinely, adults treat children to precisely this kind of learning, and hardly anyone bats an eyelid. Of course, you could reasonably say that it is OK for children to learn like this because they are children, and that, therefore, the comparison is not valid. You may be right. It may be that I’m wrong and that these techniques will solve our recruitment and retention crisis – though I would worry they would do so by attracting the sort of teacher who thinks teaching can be distilled into a little bottle labelled “take twice a day and teach like a champion”.

But, I did wonder if the negative reaction to the Chicago teacher training video could be more easily explained by asking: Did we look in the mirror? And did the mirror tell it as it is?

Queen: Magic Mirror on the wall, who now is the fairest one of all?

Magic Mirror: Over the seven jewelled hills, beyond the seventh fall, in the cottage of the seven dwarfs, dwells Snow White, fairest one of all.

Queen: Snow White lies dead in the forest. The huntsman has brought me proof. [holds up her opened box] Behold, her heart.

Magic Mirror: Snow White still lives, the fairest in the land. ‘Tis the heart of a pig you hold in your hand.

Queen: [repulsed] The heart of a pig?! Then I’ve been tricked!

Photo credit

The deception that elevates us – How the way you think you learn best isn’t the best way to learn

A deception that elevates us is dearer than a host of low truths – Alexander Pushkin

The nature of learning is complex. The most effective learning strategies are not intuitive at all and often run counter to what we believe makes for successful learning. In this blog, I will aim to pin down with a bit more clarity what learning is and to explore some of the implications for curriculum design at a time when schools are facing the introduction of new specifications at GCSE and A level.

The nature of learning

In my own experience as a student, cramming into my head facts, equations and information that I was able to regurgitate pretty much verbatim in tests made me feel like a very successful learner. My evaluation sheet at the end of every term looked pretty impressive. I was happy. My parents and teachers were happy. Everyone was happy. But I had a shameful secret: by the time the evaluation sheet got published, I could hardly remember anything from the tests in which I had apparently done so well. So, if I hadn’t retained my knowledge beyond the test, had I really learnt anything at all?

Well, it depends on how you define learning. If learning means being able to recall information when required, say in a test, then yes, I had learnt. However, in Make it Stick, Brown, Roediger and McDaniel define learning as the acquisition of “knowledge and skills and having them readily available from memory so that you can make sense of future problems and opportunities”. But the word future in this definition has always nagged me. How far in the future? The following morning? In the end of year exams? Forever after?

Early research in cognitive psychology highlighted the ephemeral nature of memory retention. This was most famously illustrated by Ebbinghaus and his Forgetting Curve. Ebbinghaus’s research predicted that if I studied osmosis and I were tested the day after my studies, I would probably do very well in that test. The conclusion would be that I had indeed learnt about osmosis. However, if I were tested on day 6 instead, we would probably reach the opposite conclusion: I had learnt close to nothing about osmosis.
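For the technically minded, the forgetting curve is often modelled as simple exponential decay, R = e^(−t/S), where R is retention, t is time elapsed since study and S is a memory “stability” constant. The snippet below is only an illustrative sketch of that simplified model – the value of S is an assumption chosen for illustration, not Ebbinghaus’s actual data:

```python
import math

def retention(t_days, stability=2.0):
    """Simplified exponential forgetting curve: R = e^(-t / S).

    t_days:    days elapsed since the material was studied
    stability: illustrative memory-strength constant (assumed value)
    """
    return math.exp(-t_days / stability)

# Tested the day after studying, retention still looks healthy...
day_1 = retention(1)   # roughly 0.61
# ...but by day 6 most of it has faded, as Ebbinghaus predicted.
day_6 = retention(6)   # roughly 0.05
```

The point of the sketch is simply that the same learner, tested on two different days, yields two opposite verdicts about whether anything was learnt.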

Since Ebbinghaus showed that memory fades quickly in a matter of days, it would be tempting to question what the point is of learning facts, events, dates or indeed anything at all. But, as Benedict Carey shows in How we learn, although forgetting “seems like the enemy of learning […] the truth is nearly the opposite”.

It turns out that the relationship between learning and forgetting is much more complex than I had initially assumed. To my surprise forgotten and not learnt are not synonymous. Forgetting is not some sort of brain design flaw. It is not a passive process at all, but an active process of filtering – a feature we have evolved to help us continue learning. Carey confirms this and writes “forgetting a huge chunk of what we’ve just learnt, especially when it’s a brand new topic, is not necessarily evidence of laziness, attention deficits, or a faulty character. On the contrary, it is a sign that the brain is working as it should.”

So, was I too harsh on my younger self? Or was I and were my teachers unwittingly deceiving ourselves that I was, in fact, learning even though most of the knowledge I acquired for tests faded away after just a few days? Well, yes and yes, on balance. As Daniel Willingham points out in Why don’t students like school? “if you pack lots of studying into a short period, you’ll do okay on an immediate test, but you will forget the material quickly.” The sort of learning that my teachers encouraged me to do all those years ago is very much like the sort of learning that, to this day, I hear teachers promoting among their students, in the belief that spending hours upon hours rereading material will result in long term retention.

This belief is the deception that elevates us, that we use to validate our efforts as teachers and students. You did well in the test, therefore you succeeded. But it is a mirage, because the feeling of fluency, due diligence and knowing that we get by spending hours revising a topic is not really knowledge, but the illusion of knowledge. Fortunately, as well as setting out the problem, Willingham also hints at the solution. He writes “if on the other hand you study in several sessions with delays in between them, you may not do quite as well on the immediate test but, unlike the crammer, you’ll remember the material longer after the test.”

Implications for curriculum design

Many of us will be facing the introduction of new GCSE and A level courses next year. In these new courses, linear approaches are heavily favoured over modular approaches, with a much reduced emphasis on coursework components (which are, in fact, being removed altogether in many subjects). Subjects which found themselves shackled to the module-assessment-module-assessment model will now find that they have much more flexibility in terms of how their subject content can be delivered. It is a great opportunity for curriculum designers to plan not just the teaching of these new curricula, but also the kind of learning that they would like to promote among students.

A single, simple quiz after reading or hearing a lecture produces better learning and remembering than rereading the text or reviewing lecture notes – Brown, Roediger and McDaniel

Brown, Roediger and McDaniel maintain that “spaced repetition of key ideas and the interleaving of different but related topics” are the essential components of successful learning and long-term retention. These are the low truths that compete against the deception that is learning by cramming. In practice, this means eschewing the modular approach to learning about a subject and enshrining in our schemes of work and programmes of study the revisiting of topics and concepts, as well as the linking of topics which would previously have stood alone as single units.

Revisiting topics and concepts would naturally encourage the sort of strategies that support effective long-term learning, making the most of the way our brains allow information to fade away to form the foundations of new learning. As Carey reminds us, forgetting stuff is fine, “some breakdown must occur for us to strengthen learning when we revisit material”.  It is this revisiting that helps us secure knowledge for the long term, building taller on the residue left by previous visits to the topic.

These strategies might include frequent low-stakes testing (this could take the form of self-testing, games or quizzes – not all testing is evil or boring). According to Brown, Roediger and McDaniel, “retrieval practice – recalling facts or concepts or events from memory – is a more effective learning strategy than review by rereading. Flashcards are a simple example”.
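One way to picture how flashcards combine retrieval practice with spacing is a Leitner-style scheduler: a correct recall promotes a card to a box that is reviewed less often, while a failed recall demotes it back to daily review. The sketch below is a minimal illustration of that idea under my own assumptions – the box intervals are invented for the example and are not from any particular flashcard app:

```python
from dataclasses import dataclass

# Days until the next review, per Leitner box (illustrative values)
INTERVALS = [1, 3, 7, 14, 30]

@dataclass
class Card:
    prompt: str
    answer: str
    box: int = 0          # box 0 = reviewed daily
    due_in_days: int = 1  # days until the next scheduled review

def review(card, recalled_correctly):
    """Promote the card on a successful recall; demote to box 0 on a failure."""
    if recalled_correctly:
        card.box = min(card.box + 1, len(INTERVALS) - 1)
    else:
        card.box = 0
    card.due_in_days = INTERVALS[card.box]
    return card

card = Card("osmosis", "movement of water across a semi-permeable membrane")
review(card, True)   # promoted: next review in 3 days
review(card, True)   # promoted again: next review in 7 days
review(card, False)  # forgotten: back to daily review
```

The expanding intervals are the “spaced repetition” and the recall attempt itself is the “retrieval practice” – the scheduler merely makes sure the revisiting happens just as the memory is starting to fade.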

Retrieval strength – i.e. how easily a fact, concept or a piece of information comes to mind – without reinforcement drops off quickly. This can result in excellent exam results but poor long-term retention. Ebbinghaus showed this all that time ago. What curriculum designers need to do is take greater care in planning the learning of their subjects, not just the teaching. They need to consider, not only what the teachers will do and what resources they will use, but also what the students will be doing and thinking, so that great learning is encouraged, not just given lip service to. Otherwise we will continue to deceive ourselves into thinking we are learning when we are really not.

Further reading

Carey, Benedict. How We Learn: The Surprising Truth about When, Where, and Why It Happens. New York: Random House, 2014

Brown, Peter C., Henry L. Roediger, and Mark A. McDaniel. Make It Stick: The Science of Successful Learning. N.p.: Harvard UP, 2014

Willingham, Daniel T. Why Don’t Students like School?: A Cognitive Scientist Answers Questions about How the Mind Works and What It Means for the Classroom. San Francisco, CA: Jossey-Bass, 2009

Photo credit