The internet as we know it began for me in early 1995. I was a first-year undergraduate studying Psychology and Artificial Intelligence at Middlesex University. One day, the AI tutor commented that “the library has got a computer connected to the ‘World Wide Web’ – you should go and look at it, it’s going to be important”. Well, I did, and he was certainly right! Even without any search engines (just some bookmarks someone had set up after reading about sites in the newspaper), and the primitive “Mosaic” browser (parts of which later found their way into IE), the potential was obvious.
(I would like to be able to say that my engagement with AI was similarly stratospheric, but I dropped it like a hot potato in favour of a straight Psychology degree. AI was a good course, and not so difficult, but to me it was reminiscent of, well, drying paint – here’s a clue.)
So my first experience of the web was as something you “went” to access. Now we have the iPhone, Android, netbooks, the iPad and the like. Anyone growing up today will have a different first web experience. Here’s something to try. First, find a pen and some paper…
How long did that take? Did it seem an unusual request? If pen and paper weren’t immediately to hand, did you feel surprised?
That’s what it must be like for someone growing up today when they think about the web. It’s an important difference, because a belief in the web as ubiquitous shapes the way one goes about many things. Take learning, for example (obviously, not a random example…). I grew up with the paradigm that learning was something you did by being taught. Then I went to university and this changed to the idea that learning was something you organised yourself using resources provided by “experts” – books, lectures, tutorials etc. Growing up today is likely to include the assumption that learning resources are already there for every topic, waiting to be used.
As Emma described in a previous blog post, the Beyond Distance Media Zoo recently hosted a presentation by Professor Phil Candy of the University of Southern Queensland about the Four+ Scholarships in the Digital Age. One of the points made in Phil’s talk was that universities are moving towards three primary functions: 1) providing a “Road Map” for navigating the learning materials of a subject (and also a template for building one’s own understanding), 2) Information Literacy (how to use the information effectively), and 3) Accreditation (evidence of successful learning).
I could imagine that someone growing up today may not value 1) and 2), since the information appears readily and freely available, with a free “road map” (Google / YouTube) and no obvious need for training in how to use the information (just read it / watch the video!). This would put the emphasis on 3) Accreditation, making a university course simply a shop for purchasing degrees, paid for with money and, to an extent, with time allocated to study.
Yet Information Literacy is crucial. What if two YouTube videos give different messages? “Compare and contrast” may be a cliché, but it’s also a good way to understand different people’s views, while knowing an abstract representation of a subject – and how to apply it – allows for an understanding that can be used to question, to evaluate and to predict.
As Phil Candy also pointed out, it’s easy to focus on the technology (and getting information) and easy to miss the point, which is to gain understanding.
Information Literacy is the next big thing. You might want to jot that down.
Paul Rudman, BDRA