In my first year as a PhD candidate, in 2000, I helped make a video (stood around with a few spare cables, to be precise) for Tomorrow’s World on a device called HANDLER. We showed children using this device on a history field trip to Birmingham’s canal system.
Prof. Mike Sharples (now director of the Learning Sciences Research Institute at the University of Nottingham) had put together this “HAND-held LEarning Resource”, a tablet PC with touch-screen, camera, text editor, sketchpad, cognitive mapping tool and mobile phone. Each group of children had a device, which gave them their “mission”, allowed them to record their findings (as video, stills, text, sound and sketches), and to communicate with other groups.
The HANDLER project at the University of Birmingham researched the educational possibilities of a personal learning device: an always-with-you, always-on information and communication tool.
Sound familiar? Well, it’s not the iPad. Not quite. Not yet. Sorry.
The plan, though, was this: “It will enable people to browse the web and run multimedia software while on the move, to capture and store images and sounds, to annotate these with notes and sketches, to include this information as part of phone conversations, and organise and share it with people around the world.” (Sharples et al. 2001)
There are two further challenges to be met before Sharples’ vision becomes a reality. One is hardware. Communication is CRUCIAL, because feedback from peers and tutors is essential to effective learning. Skype would do, but it only works where there is WiFi, so yes, a mobile phone voice connection would be really good. The camera is also important, because capturing and sharing images is sometimes the only practical way to have a joint representation of a concept under discussion.
The second challenge is software. The word is INTEGRATED. As in “A learner might point the camera and take a picture…then annotate it using the drawing package and drop [that] into a page of notes. A graphical time-line holds all the learning items, including note pages, webpages, individual images, sounds and video.” (ibid.)
At present, the emphasis is on the application. You run a text editor. Or a drawing package. The front-screen of the iPad shows a list of applications.
It needs to show a list of resources – texts, pictures, combinations of both – and the relationships between them. Applications need to switch seamlessly and invisibly.
Does the iPad bring us any closer to this vision?
Is it cool?
Does it deliver enough to change the face of learning as we have known it?
So, almost pointless then…
Paul Rudman, BDRA