The Information Age of Things

Background
After the dissolution of the Institute of Learning Innovation, the EMMA project was taken over by the University of Naples (Federico II), and Prof. Grainne Conole, Dr. Brenda Padilla and I were contracted to create and run two MOOCs on the EMMA platform. I was responsible for “21st Century Learning” – a MOOC about innovation in pedagogy and related technologies since the turn of the century.

The discussions amongst participants generated all manner of questions. One in particular set me thinking. During the week on Virtual Reality, the question arose as to how different terms should be defined. The accepted definitions for these, and various related terms, have varied over the years, but I want to try and pin down some definitions. I’ve tried to create ones that relate to the experience, independent of any technology.

artificial reality
virtual reality
virtual environment
virtual world
immersive and not immersive
augmented reality
avatar
avatar vs first-person view
telepresence

A contemplation
Let’s start with the question, “So, what’s wrong with ordinary reality?”

And I suppose the answer is, “Nothing.” Except that it’s a bit restrictive. I can’t fly, or teleport, I don’t have x-ray vision, I can’t swim to the bottom of the sea or visit Mars, I can’t shrink myself to an ant and explore a forest, I can only run so far without having to have a good lie down… you get the idea.

Now it may be that I have an over-active imagination and want to do lots of unreasonable things, but I don’t think so, because the last few hundred years have seen people using technology to extend their reality in all manner of ways. Since the 16th Century, if not before, there’s been a stage illusion now known as “Pepper’s Ghost”, where a sheet of glass is placed at 45 degrees across the stage, so that the audience see both through the glass – the main stage – and a reflection of a person or object off-stage – which can appear to float in mid-air.

As soon as photography was invented, people started to create images that weren’t real, such as the Cottingley Fairies of 1917–1920 – photographs of paper cut-out fairy shapes that many believed to be real. And that was just the start of our love affair with the unreal. So many images we see nowadays have been adjusted, or “photoshopped”, that people are starting to question whether we may now have strayed too far from reality.

In 1937 the Walt Disney Studios created its first feature-length animated film (with a quarter of a million hand-drawn images…): Snow White and the Seven Dwarfs. The first feature-length animation made entirely from computer generated imagery (CGI) was Toy Story in 1995. Even “ordinary” movies contain so many “special effects” that they are a long way from the reality that existed during their making.

Humans, it seems, are not satisfied with reality – at least, humans in modern Western society. But why? Damien Walter wrote an interesting article, leading to the question: “Do our fantasy worlds help us to escape, not from reality, but from our own limitations?”  Maybe so.

And now the escapists – that is, all of us – have technology so advanced that, not so long ago, it would have been indistinguishable from magic (to misquote Sir Arthur C. Clarke), and so basic that in 100 years we will be called “primitive”.

So, let’s start at the other end and work backwards. What form will our escape from reality take in 100 years’ time? OK, I have no idea. So let’s try a related question. What form would we like this escape from reality to take in 100 years’ time?

I’m sitting in an armchair, in a room at home. It’s a nice home, but really, I’ve sat in it quite a bit since I moved here. Wouldn’t it be nice if I could change the appearance of the walls at the touch of a button? No, at a voice command. “Make the walls light blue”, and now the walls look as though a painter’s been in. Except there’s no painter, and no paint, and if I turn off the piece of technology that’s making the walls look blue they’ll jump back to being cream.

What is this technology, you may ask? Well, we’re still 100 years in the future, and it hasn’t been invented yet, but please bear with me a bit longer…

How about, I don’t like walls at all. Let’s remove them. “Remove walls, create a country scene”. So, now there’s a coffee table in front of me, and then a sofa, and then a hedge, some trees, and a field of sheep. A robin flies down and perches on the edge of the coffee table. It feels like I’m sitting in the countryside. Of course, I know the hedge is actually a brick wall, so I’m not going to try and jump over it, and the sheep won’t wander into the room because, well, they are virtual sheep, computer generated, and the computer knows that small birds are pretty and sheep are, well, annoying.

I call my friend, and we have a chat. She’s sitting opposite me, on the sofa. Except she’s not of course. She’s sitting on her own sofa miles away, but it feels like she’s here, and – from her perspective – I’m sitting in a chair at her house. Of course, I can’t hand her a coffee, because she’s only virtually here – a telepresence – but that’s ok, because it feels as though she’s here, which is the main thing.

I’ve tried to create definitions that would fit this scenario just as effectively as today’s technology, so that the definitions describe the experience, not the technology.

artificial reality – coined by Myron W. Krueger, this is the broadest definition: anything that appears to be real – completely real and present, genuinely mistakable for a real, ordinary, physical experience – but is actually an illusion. We don’t really have the technology to do this yet, but we will, and sooner than one may think. The experience doesn’t have to be computer-generated, although that is our current best technology. The obvious fictional example is Star Trek’s “holodeck”, but I don’t want to get hung up on technology – we don’t need any specific technology for these definitions, they are about the experience, not the hardware.

virtual reality – artificial reality that is not completely convincing – it is apparent that something is amiss, or the presentation requires some imagination. So, for example, anything you need a screen to see, or some sort of head-mounted display that’s sufficient to remind you that something odd is going on, or where objects don’t behave as real objects – pixellated, juddery, not fully believable. The experience is not “real” in some noticeable way. The big thing is that humans are very adaptive, and can gain a lot from experiencing reality in a non-realistic form – from feature films to Second Life.

virtual environment – sufficient objects using virtual reality to feel as though you are in a location, and the ability to move around that location. So, not just a virtual teapot, but a whole room. Or countryside. Or Mars. A virtual environment can include other people that one may interact with, usually represented by avatars – characters that don’t necessarily look exactly like the actual person (or look completely different, such as a cat). (I’ll talk more about avatars in a while.)

virtual world – a versatile virtual environment that appears to be, or is effectively, infinite. So, not just the one virtual space, but numerous different spaces, and usually the ability to create new spaces with virtual objects at will. Coming back to artificial reality, if technology ever reaches the point where it can create a virtual world indistinguishable from the real one (i.e. you can physically walk around and interact with non-existent things) then I would call that an artificial world.
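To summarise how these definitions nest, here’s a toy sketch (in Python, with made-up field and function names – nothing here comes from any real VR API or standard) that classifies an experience by the distinctions above:

```python
from dataclasses import dataclass

@dataclass
class Experience:
    """Illustrative model of the definitions above (field names are my own)."""
    appears_real: bool          # presents an illusion of something being there
    fully_convincing: bool      # indistinguishable from ordinary reality
    is_location: bool           # enough objects to feel like a place you can move around
    effectively_infinite: bool  # many spaces, with new ones creatable at will

def classify(e: Experience) -> str:
    if not e.appears_real:
        return "ordinary reality"
    if e.fully_convincing:
        # fully convincing + unbounded is what I called an "artificial world"
        return "artificial world" if e.effectively_infinite else "artificial reality"
    if e.effectively_infinite:
        return "virtual world"
    if e.is_location:
        return "virtual environment"
    return "virtual reality"

# A single virtual teapot on a screen: virtual reality, but not an environment.
teapot = Experience(appears_real=True, fully_convincing=False,
                    is_location=False, effectively_infinite=False)
# A Second Life-style space: many user-created regions, not fully convincing.
second_life = Experience(appears_real=True, fully_convincing=False,
                         is_location=True, effectively_infinite=True)
```

The point of the sketch is only that each term adds one distinction to the previous one – convincingness, then place, then unboundedness – not that software would ever classify experiences this way.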

immersive vs not immersive – this is really subjective, and depends on whether someone using this sort of technology reaches the point where they suspend disbelief and act – within reason – as though the experience is real. Clearly, this is easier with some experiences (e.g. a modern racing car simulator) than others (e.g. the “Pong” video game), but it also depends on the person as to whether they make the necessary imaginative leap from a virtual experience to a real one. (Of course, by my definition, an artificial experience wouldn’t require any imaginative leap, so would, by definition, be immersive – unless, that is, the person knows it’s artificial and deliberately treats it as not real).

augmented reality – this is where the real world and virtual (or artificial) reality are both present at the same time, like the robin on the coffee table. Microsoft’s HoloLens actually comes remarkably close to this (as we saw in the MOOC), despite still needing a lot of development.

avatar – a computer-generated character that represents a person in a virtual environment. As in the book Snow Crash, the representation of people in virtual spaces will move forward significantly when actual facial expressions can be represented accurately in the virtual. Linden Lab’s Project Sansar promises to go some way towards this. In the long run, I see avatars as becoming increasingly realistic to the point where they look like real people – although, like today’s avatars, not necessarily like the actual person.

avatar vs first-person view – in my opinion, following an avatar all the time is a hangover from virtual reality’s gaming past. In some of the interviews for the SWIFT project (where we created and trialled a virtual genetics lab in Second Life) I asked participants if they felt they identified with the avatar (a concept that seemed important at the time). Generally, I got the impression that participants thought this an odd question. They generally had the feeling that they themselves were doing the experiment – not the avatar, and not something they watched on video. Indeed, two of the interviewees said that the avatar just got in the way (obscured their view of the lab bench). So I would not refer to “first-person view”, any more than I think of my reality sitting here typing as “first-person view” – it’s just how I see the world, how I’ve always seen the world. As we move forward towards artificial reality, I think it’s time to leave viewpoint behind and just think about reality – what I see is how it is.

telepresence – the experience of being somewhere in the real world other than where one is. Telepresence often refers to some form of videoconferencing, such as Edward Snowden’s TED talk, but also extends to control of distant robots with various amounts of realism. From the perspective of other people at the remote location, the robot or video screen represents the person engaged in telepresence. The film “Avatar” imagines a highly sophisticated version of telepresence. From the experiencer’s perspective, telepresence is similar to virtual (or even artificial) reality, but the big difference is that the environment in which they find themselves is actually a real environment and, unlike virtual or artificial reality, may contain physical people, animals, etc. The difference is important: in virtual (or artificial) reality other people are not actually present so cannot be physically harmed, whereas the environment the telepresent person is in is real, and real harm can occur to the people in it (but not to the telepresent person, of course).

Conclusion
I think, over time, the need to distinguish between these forms of reality will diminish. We will get used to the idea of virtual objects and experiences being part of our lives, to a greater or lesser extent. A virtual robin on the coffee table will seem ordinary – it will just be a robin and a coffee table. Probably some slang term will appear for virtual objects. Maybe something like “Don’t bother feeding the robin, it’s holo”. There will be shops full of models walking around displaying the clothes, changing instantly to match what a computer decides nearby shoppers might like, and the shoppers will know that there are no models, but it will seem normal. There will be parks, and art galleries, and sports arenas that are just empty concrete spaces, but we will only know that if we ignore the trees and paintings and action in front of us, and stop and think.
It will be The Information Age of Things.

Paul

Dr. Paul Rudman, April 2016, Leicester, UK
paulrudman.net


Free Open Access Medical Education

For some years now, I have noticed that medical educators are looking at learning innovations in their own unique way. I first became aware of medical education happening in virtual worlds and simulations, such as Coventry’s virtual maternity ward in Second Life, and St George’s paramedic training in Second Life. 


Damian Roland argues for the use of social media in a 26 June debate held at University of Leicester

Our own University of Leicester brought medical students into a virtual genetics lab as a way of offering additional training in genetic testing. Dr Rakesh Patel and his team developed a Virtual Ward (still going today), in which students may visit virtual patients and practise coming up with a diagnosis. When I tweeted about these kinds of initiatives, I would receive replies using the hashtag #meded or #virtualpatient.

But last year I began to see a new one on Twitter: #FOAMed – Free Open Access Medical Education – or just #FOAM – Free Open Access Meducation. I began to follow people like Anne Marie Cunningham (@amcunningham), Natalie Lafferty (@nlafferty), and Damian Roland (@Damian_Roland), among others who, as medics and medical educators, see the value of using social media in medical education, or the value of blogs, or the value of a crowd-sourced site of medical questions and answers such as gmep.org. Meanwhile, Rakesh was coming up with ideas thick and fast: why not tweet and record the Nephrology conference SpR Club this past April, and the TASME Meeting at De Montfort University this past May? And so I did!

Then Rakesh and Damian got the bright idea to debate the motion: “This house believes that medical educators must use social media to deliver education.” The debate took place on 26 June at the University of Leicester, and I was able to live-stream and record it, as well as join in the Twitter discussion. There were several remote participants, including one from Canada, in addition to the approximately 20 attendees face-to-face at the Medical School. Not only did the debate spark real interest and a sense of challenge among those present (many of whom seemed to be new to the ideas of FOAMed and social media), but the discussion also continued on Twitter for a good couple of days, as the images below show. You can listen to and watch the video of the debate here.

Screenshots of the Twitter discussion following the debate

Now the ASME Annual Scientific Meeting is happening in Edinburgh, and Rakesh, Natalie, and others are presenting a workshop on FOAM. My name is on the presenters’ list as well, and although I could not attend, I shall be eagerly watching for tweets from the conference. I have come to see, especially through the eyes of my medic colleagues, that Free Open Access Meducation is a better education than closed – better because the wider one’s network, the more information one can access; better because more learners are reached via open platforms than closed; better because open encourages interdisciplinary sharing and learning… the list of benefits goes on.

Terese Bird, Learning Technologist and SCORE Research Fellow
