Vicariously virtual

At the LLAS (Subject Centre for Languages, Linguistics and Area Studies) E-learning Symposium on Friday 29 January 2010, Ming Nie and I, together with our avatars Ming Cham and Daffodil Moonwall, presented on our recent Second Life (SL) project with distance students on the Online MA in TESOL and Applied Linguistics. The presentation is described in the DUCKLING blog. Here I want to reflect on the combination of technologies used in the presentation.

This was a truly mixed-mode presentation, with a number of ways in which audience members could participate. For starters, there were approximately 30 people in the lecture theatre at the University of Southampton where I was presenting. They got to see me in real life, as well as Daffodil and Ming Cham on the big screen in SL. Meanwhile, Ming Nie was joined in real life in Leicester by a colleague from the MA course team, who had accepted our invitation to vicariously experience the virtual presentation. A few other colleagues joined Daffodil and Ming Cham in SL via their avatars. Finally, the event was also live streamed via video for anyone who wanted to see what was happening in the lecture theatre in Southampton from a distance. (Here I must apologise to my mum for sms-ing her the wrong URL – I misspelt “tinyurl” as “tinurl”, which had the effect of sending her off to an array of porn sites, thereby confirming her lack of faith in all things Web.)

For anyone who missed the live session and would like to see the recording, it is available, along with recordings of the other sessions from the symposium, at www.tinyurl.com/LLAS-livestream.

From a technical point of view, everything went very smoothly. Ming and I had our PowerPoint slides embedded in Second Life, and our avatars stood on either side of the virtual screen. I had two microphones – a lapel mike for the live streaming, and a headset mike for SL. We connected to SL without a hitch, and with the help of technical gurus on both sides (thanks Graham, Dean, Terese, Simon and Paul!) we managed to get excellent sound quality in both venues.

Despite the technical success, I am curious to know what value SL really added to the presentation. Yes, it allowed Ming to co-present with me from a distance. And yes, it gave a feel for the environment in which our study had taken place, which made it appropriate for the occasion. However, I wonder how exciting it was for members of the audience to watch two rather stationary avatars standing at the front of a virtual lecture theatre, and speaking to PowerPoint slides with disembodied voices…

Perhaps both Daffodil and Ming Cham need to learn the art of moving unobtrusively and gesturing while speaking? (Not as easy as it sounds though, because as soon as you move your avatar, you also lose your view of the presentation screen, at least momentarily, which could be very disorientating for any audience members watching the screen from the presenters’ point of view.) Perhaps we could have tried lip-synching, so that it was clear which avatar was speaking? Perhaps we could have hovered over the screen rather than standing next to it? Perhaps we could have used a more visually attractive presentation format such as Prezi, rather than PowerPoint? Perhaps… we could get some creative ideas from blog readers for the next time we try something like this?

Gabi Witthaus, 1 Feb 2010
