Last week I was accused of being a luddite. Me. A luddite. This came about after I was critical of Google Glass, which apparently is the next big thing.
When I first saw the amazing video Google has produced to promote Glass, my jaw dropped. Science fiction had become science fact. Truly astonishing stuff. However, as a short-sighted, fashion-conscious, daily glasses-wearer, I immediately started having doubts about its practicality. Would I want to wear Glass all the time? If I took it off, would it fold neatly into my shirt pocket? Would I really want to talk to a computer I’m wearing on my nose in public? On the bus? Walking along the street? And wouldn’t I look like an idiot wearing it?
To my surprise, I found I wasn’t so worried about Google’s plans for monetizing Glass, or about privacy concerns or about where this technology might lead us. Instead I found myself worrying about fashion, vanity and practicalities and my conclusion was that no, I probably would not want to wear Glass in most circumstances. For me, a small tablet device is still the best option. So I said so. And thus I became a luddite.
The truth is that I rather like the sort of company that Google is becoming: creative, innovative, bold and brave. Apple is reputedly also trying to come up with the next big thing, which, for them, apparently is a wrist watch. But this comes just as fewer and fewer of us feel the need to wear a watch at all. Try this: next time you’re in school, ask your students for a show of hands if they are wearing a wrist watch. I bet only a few hands will go up, if any at all.
Cue the Google vs Apple rivalry, which is when things get really silly and we start behaving like small children in a playground – my dad’s Apple is bigger than your Google – and we begin to lose sight of what the next big thing really is, which isn’t a pair of bionic glasses or even a speaking wrist watch. The next big thing is that access to the social internet is no longer limited to the little tablet in your pocket (or the big tablet in your bag). So, let’s not get mired in and diverted by pointless tribal arguments, and let’s start exploring the enormous implications for our schools of ubiquitous internet access.
Earlier this week I led a seminar for PGCE students at Nottingham University on the use of the internet and its potential for encouraging pupils’ creativity. To start, I asked those present to put their hands up if they used the internet daily. All hands went up. I then asked them to keep their hands up if their pupils used the internet on a daily basis. After a moment’s thought, all hands stayed up.
However, when I asked the PGCE students – who had all finished their first teaching placement – to keep their hands up if they had planned or been encouraged to plan lessons, sequences of lessons or homework that required the use of the internet, all hands went down. Isn’t it curious, I asked them, that all of you and all of your students use the internet daily but none of you exploit its potential for teaching, learning and creativity? Isn’t it curious that schools force their students to inhabit an alternative reality for six or seven hours every day where the internet doesn’t exist?
Students who have entered secondary education in the last two years can’t remember life before social media. Despite this, the schools tasked with their education often fail to grasp the important role that social media plays, not only in the private lives of their students, but also in the wider school community.
In this context, young people’s use of social media tends to be misrepresented and very unfavourably portrayed by schools and teachers who, perhaps, feel constrained by the circumstances and pressures in which they work, and who might fear that a loss of control would amount to capitulating to what they perceive as the current generation’s preference for immediacy. The overall conclusion is often that social media is a disruptive force which further erodes academic rigour and undermines the teacher’s traditional role and relevance, thus proving in the eyes of many sceptics that social media is unfit for academic purposes.
A Level results came out last week. In a year which has seen the number of top grades reduced nationally for the first time in decades, Nottingham High School – my school – has seen not only a continuing improvement but its best results ever (72% A*-A), a feat that saw us move up into the top ten independent schools in the country.
In Spanish – the subject I teach and for which I am directly responsible – our results have also been our best ever (88% A*-A; 100% A*-B). Few of my students would have believed this possible at the beginning of Year 10, when they could barely say their names and where they lived with any confidence at all! Four years on, thanks to their hard work and dedication to the subject, they have done themselves – and me – very proud indeed.
It was during these four years that I began to research the transformational potential of social media and ICT in general and to apply some of my findings to my teaching practice. Many fantastic things happened during those four years: my wife and I had another boy, my work in technology integration started to be recognised nationally and internationally, I was fortunate to be promoted to Head of Modern Foreign Languages and I gained a Masters Degree in ICT and Education.
However, during that time there have also been plenty of people who have questioned my approach, for having the audacity to suggest that social media in general – and social networking in particular – could be harnessed by schools to benefit both teaching and learning.
In the early 1950s, my grandparents leased a humble patch of farming land in rural Andalucía where they were able to grow crops and graze their modest herd of dairy cows. There they built a house and raised their children.
In those days if you needed water, you had to dig a well. So they did. Incredibly they managed with a well and without electricity – using butane and paraffin lamps – for another 20 years.
By the time my dad was a young man, my grandparents had invested in a diesel generator that would allow them to run some electric appliances and watch TV or listen to the radio in the evenings. Eventually, round about the time my mum and dad got married and I came into being in the mid-seventies, civilisation arrived and they were able to connect to the mains for both water and electricity.
As you can imagine, the well and the generator, which had so faithfully served the needs of my family all those years, quickly fell into disuse.
I was recently at a conference where I met a newly appointed school leader who had heard of my work in technology integration. As we were introduced, the expression on his face gradually changed from your-name-rings-a-bell to ah-I-know.
He was interested to know what the latest trend in technology in education was and fired off some questions about using technology in the classroom, answering quite a number of them himself with a lavish sprinkling of the latest buzzwords.
He was, it turned out, on the lookout for “the latest innovative practices”. As if innovation were a product you could purchase wholesale at conferences.