Last week I was accused of being a luddite. Me. A luddite. This came about after I was critical of Google Glass, which apparently is the next big thing.
When I first saw the amazing video Google has produced to promote Glass, my jaw dropped. Science fiction had become science fact. Truly astonishing stuff. However, as a short-sighted, fashion-conscious, daily glasses-wearer, I immediately started having doubts about its practicality. Would I want to wear Glass all the time? If I took it off, would it fold neatly into my shirt pocket? Would I really want to talk to a computer I’m wearing on my nose in public? On the bus? Walking along the street? And, wouldn’t I look like an idiot wearing it?
To my surprise, I found I wasn’t so worried about Google’s plans for monetizing Glass, or about privacy concerns, or about where this technology might lead us. Instead I found myself worrying about fashion, vanity and practicalities, and my conclusion was that no, I probably would not want to wear Glass in most circumstances. For me, a small tablet device is still the best option. So I said so. And thus I became a luddite.
The truth is that I rather like the sort of company that Google is becoming: creative, innovative, bold and brave. Apple is reputedly also trying to come up with the next big thing, which, for them, apparently is a wrist watch. But this comes just as fewer and fewer of us feel the need to wear a watch at all. Try this: next time you’re in school, ask your students for a show of hands: who is wearing a wrist watch? I bet only very few hands will go up, if any at all.
Cue, then, the Google vs Apple rivalry, which is when things get really silly and we start behaving like small children in a playground – my dad’s Apple is bigger than your Google – and we begin to lose sight of what the next big thing really is, which isn’t a pair of bionic glasses or even a speaking wrist watch. The next big thing is that access to the social internet is no longer limited to the little tablet in your pocket (or the big tablet in your bag). So, let’s not get mired in and diverted by pointless tribal arguments, and let’s start exploring the enormous implications for our schools of ubiquitous internet access.
I was recently at a conference where I met a newly appointed school leader who had heard of my work in technology integration. As we were introduced, the expression on his face gradually changed from your-name-rings-a-bell to ah-I-know.
He was interested to know what the latest trend was in technology in education and quick-fired some questions about using technology in the classroom, answering quite a number of them himself with a lavish sprinkle of the latest buzzwords.
He was, it turned out, on the lookout for “the latest innovative practices”. As if innovation were a product you could purchase wholesale at conferences.
My definition of innovation is this: innovation is doing things in ways you didn’t realise you could. In education, as in other fields, innovation is so much more than the application of new technologies. But it must be acknowledged that new technologies have often acted as the catalyst for innovation, not just in education but in all areas of society.
The realisation that you can do things differently – in ways you didn’t realise you could – does not come suddenly to most of us. Innovation is the progressive development of awareness and the gradual appreciation of an alternative model.
The management of the hotel where the above sign was placed in the 1880s was attempting to help its customers come to terms with a new paradigm: electricity was challenging patterns of behaviour established over centuries and, just like the internet is doing today, it served as the catalyst for a wave of innovation.
I know very few teachers who don’t rely on technology of one kind or another to help plan and deliver their lessons. I know fewer students still who do not rely on the internet to help them learn. Yet both teachers and students often try to use technology to support well-established patterns of behaviour, in an inadvertent attempt to perpetuate what they are already familiar with.
Instead, I would like to propose that schools embrace innovation not as a target or a policy, but as a culture; that is to say, schools need to become places in which innovation – as defined above – is allowed to grow and flourish. But for that to happen, we need to stop trying to light electric bulbs by striking matches.
Science is defined as “the intellectual and practical activity that encompasses the systematic study of the structure and behaviour of the physical and natural world through observation and experiment”. Science relies on the accumulation of previously acquired knowledge. Scientists collaborate and learn from one another. They observe, test and experiment so that new knowledge can be obtained.
Now contrast that with Education. When it comes to Education, dogma trumps evidence and strongly held beliefs win over testing, experimentation and innovation. Whereas in Science they tinker, tweak and fix, in Education we prefer to say “if it ain’t broke, don’t fix it”.
As a result, it is very sad to think that whilst Science is taking us on a marvellous and unceasing voyage of knowledge and discovery, Education remains stuck somewhere in the 19th century.
In my view, Education needs to take a leaf from Science’s book: we need to encourage research and experimentation so evidence can be obtained on which to base our practice. There really ought to be no room for dogma or belief, however strongly held. We can do better than this. We have to do better than this.
In our age, when men seem more than ever prone to confuse wisdom with knowledge, and knowledge with information, and to try and solve problems of life in terms of engineering, there is coming into existence a new kind of provincialism which perhaps deserves a new name. It is provincialism, not of space but of time; one for which history is merely the chronicle of human devices which have served their turn and been scrapped, one for which the world is the property solely of the living, a property in which the dead hold no shares.
‘What is a classic?’, the presidential address delivered by TS Eliot to the Virgil Society in 1944. Published in his volume of essays entitled On Poetry and Poets, 1957.
In this passage, TS Eliot denounces what he termed temporal provincialism, a phenomenon by which we undervalue past experiences in favour of the present and the instant gratification it promises.
Teachers opposed to embracing new technologies and adopting modern computer-mediated means of communication often use similar arguments against those who propose transforming teaching and learning by exploring and exploiting the potential these new technologies may offer.
The internet is often criticised by teachers for prizing information over knowledge and for being a capitulation to what they perceive as a lack of academic rigour and preference for immediacy among the current generation of students. Similarly, the use of social networking sites is often disparaged and even vilified for infantilising young people’s brains and reducing their ability to communicate face to face, as if social networking were a substitute for face-to-face communication.
Such received wisdom may well be full of common sense, but it is actually unsupported by research and, upon closer scrutiny, it reveals itself to be based on assumption, misunderstanding and preconception. Actual research on the subject suggests that even the humble internet search is a valuable meaning-making activity that supports the acquisition of knowledge, the creation of remote associations and creative development. And internet searches are just the tip of a very large iceberg of untapped potential.
TS Eliot’s temporal provincialism condemns overestimating the importance of the present. However, I would propose that today we suffer from a kind of temporal conservatism, whereby undue weight is given to present, more traditional methods of teaching and learning, whilst the future potential of promising new technologies is largely ignored by schools blinkered by the here and now.
I have recently been very troubled by the realisation that sometimes some people will not see past my peculiar name or my foreign accent and will make prejudiced assumptions about my competence in my profession or suitability for a role.
A thread I started on Twitter on this topic confirmed that, sadly, I am not the only foreign educator working in the UK who feels that other people’s perception of our competence is linked to factors beyond our control, such as our country of origin.
I certainly don’t wish to portray myself or anyone else as the victim of discrimination or injustice. On the contrary, I am fully aware that I am just as guilty of prejudice and bias as the next person. Which is what really concerns me.
The irrationality behind such prejudice and bias, and the way it determines which new ideas we adopt or discard, probably lies behind many of the decisions we make, if not all of them.
I have often joked that one of the best ways to get school leadership to adopt your idea is to craftily make it look as if it was their own. It turns out my instinct may well be right after all.
Dan Ariely explains, below, how we instinctively place more importance on our own ideas than on those of others. It’s a well-documented psychological phenomenon called the Not Invented Here Bias.