Virtual world training in 30 minutes

An interesting question arose from my ALT-C talk last week. It was basically “How can you use Second Life for teaching when it takes two hours to learn how to use it?”
Which isn’t really a question, of course. It’s a statement. Along the lines of “It takes my students two hours to learn to use Second Life”.

So, here’s a question in reply: Do you expect your students to be able to use MS Word? Yes? Including MailMerge? Macro programming? I suspect not. They probably just need basic formatting. Maybe headings. An index for the really advanced. And it’s the same with learning to use Second Life. Thirty minutes’ training is all that’s needed for most learners in Higher Education.

The key is to consider training as part of the overall design. Here’s what we did for SWIFT.
1) Define the Learning Objectives. For our second lab, these were to practice evaluating experimental results and to learn the connection between theory and practice.

2) Design activities that will best support those Learning Objectives. In our second lab, the activity was to work through a sequence of experimental steps and results, answering questions about procedure, interpreting results and seeing animations of molecular processes at critical moments.

SWIFT learner's avatar showing virtual lab and HUD and animation

3) Design the environment necessary for those activities. We created individual lab benches with replica equipment, and a Head-Up Display that acted as the automated guide.

4) Define the SL competencies necessary to accomplish those activities. So,

a) Walk – well enough to position the avatar in one place
b) Close the sidebar
c) Touch (click on) objects
d) Chat
e) Zoom the camera in on one spot
f) Put on / remove a lab coat
g) Attach the HUD

Now, most of these only need to be done once, and some will already be understood (like clicking on things) so there’s no need for lots of practice. All that learners really need to be good at is zooming the camera. So the 30 minutes is something like 10 minutes for the easy things, 10 minutes for the lab coat and 10 minutes for the camera.

Visitors in the SWIFT training area

5) Create or adapt a training area suitable for learning and practicing those skills (and only those skills, so the training area may need adjusting for different groups). There are many training areas in SL, some better than others. Ours is here. Basically, the avatar needs to be constrained until they can walk properly, instructions must be very clear to all, and tasks must be in a logical progression. We have adjusted our training area over the last 12 months using observation and in-world interviews and questionnaires.

And that’s it! We don’t teach them how to run, fly, IM, search, teleport, build, offer friendship, use weapons, drive vehicles … there’s quite a list, and if they choose to continue using SL in their own time and outside of the University island they will probably want to use many of these. And they may need MailMerge in MS Word for running their own business…

So, ask learners new to SL to sign up for an SL account on the web site in advance. Then in the class, when they first use SL, ask them to enter the location of your training area at the SL login screen (so they don’t wander round some public place) and the half-hour training will pretty much run itself. (Yes, really, you just need someone hovering to help the occasional student who uses existing knowledge or expectation in place of the instructions.) We would expect similar success with OpenSim implementations, but can’t speak from experience with these.

How well the actual lesson goes depends on many things, from what’s to be learned and how that’s represented in the virtual world, to how well the environment is built and how motivated the students (and teacher) are. Some things can be learned well in virtual spaces, others not. Some virtual world use is embarked upon with enthusiasm, some not. What we can say with some certainty though, is that SL training need not be a problem.

Paul Rudman,
BDRA

Why using virtual worlds for teaching just got easier

One of the Frequently Asked Questions about using virtual worlds as a teaching and learning environment is: “How much does it cost to prepare a learning environment?”

Last week, Linden Lab (makers of the Second Life (SL) virtual world software) added a new feature: “Mesh”. On the face of it, this could lower the cost of building suitable teaching environments within SL, but, as with most new features, it’s hard to predict just how useful it will be. So I decided to try it out…

We are in the process of setting up the third SWIFT experiment in SL, and we need to create a simple building with some visual interest. We settled on an Egyptian-style pyramid. Until now, the standard (and almost only) way to build in SL was using “prims” – simple shapes one materialised (or “rezzed”) and manipulated within SL. Creating our pyramid with prims would look something like this (you add the “texture” – image of stone blocks or whatever – later):

With Mesh, you design objects first using one of a number of free or commercial programs and then import them as objects into SL. Apart from now having a choice of tools to use, there is one huge advantage: because the object is built out of lines rather than solid shapes, you need only think in terms of what you see, not component shapes that you have to imagine.

For example, in the picture above, I’m creating a pyramid out of triangular things. For a Mesh object, I can create it with lines, like this:

I’m using the free Google SketchUp program (that character is not an avatar, it’s just a 2D drawing, there to – I assume – give a sense of scale). Other programs are available, such as Blender (better, but not so easy to learn) and Maya (if you have a big budget!). SketchUp took a few hours to learn, but now I could recreate the pyramid in a few minutes – much quicker than using the building tools in SL.

Then, it’s a simple matter to export the shape as a file and import it into SL… and, voilà! A 3D pyramid in SL.
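
For anyone curious what such a file actually contains, a mesh boils down to a list of corner points (vertices) and the flat faces that join them. Here is a minimal, purely illustrative sketch in Python that writes a square-based pyramid in the simple Wavefront OBJ format; note that SL’s mesh uploader actually expects COLLADA (.dae), which tools like SketchUp and Blender can export, so this is only to show the idea:

```python
# Purely illustrative: a mesh is just vertices plus faces.
# This writes a square-based pyramid as a Wavefront OBJ file.
# (Second Life's mesh uploader expects COLLADA (.dae); export that
# from SketchUp or Blender for a real import.)

vertices = [
    (0.0,  0.0,  0.0),   # base corner 1
    (10.0, 0.0,  0.0),   # base corner 2
    (10.0, 10.0, 0.0),   # base corner 3
    (0.0,  10.0, 0.0),   # base corner 4
    (5.0,  5.0,  8.0),   # apex
]

faces = [
    (4, 3, 2, 1),        # square base
    (1, 2, 5),           # four triangular sides
    (2, 3, 5),
    (3, 4, 5),
    (4, 1, 5),
]

with open("pyramid.obj", "w") as obj:
    for x, y, z in vertices:
        obj.write(f"v {x} {y} {z}\n")                       # one vertex per line
    for face in faces:
        obj.write("f " + " ".join(map(str, face)) + "\n")   # 1-based vertex indices
```

Five points and five faces are enough to describe the whole pyramid – which is the sense in which you only think about what you see, rather than assembling solid blocks.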

For the first SWIFT experiment I created a virtual PCR machine which, as I recall, took a whole afternoon to build out of textured prims. Even though much of the time was taken in preparing the textures (images taken in the real lab), drawing it with lines would definitely have been easier than shaping individual blocks, and the level of detail possible would have been greater too.

Mesh is still new, and there will be drawbacks for a while (there seems to be a bug that doesn’t let me walk to the far corner inside the pyramid, for example). Nonetheless, I’m really very impressed by the possibilities Mesh has to offer. I’m sure it won’t be so long before all the OpenSim grids support Mesh too.

So, if you were thinking of using virtual worlds for teaching and learning, things just got easier!

Paul Rudman,
BDRA

The icy winds of change

Social networking seems quite good at providing random, but surprisingly serendipitous, information. Recently, academia.edu told me that someone had searched for a book chapter I co-authored a few years ago (Rudman et al. 2008). It included a paragraph of predictions for the future of elearning, and I began wondering how accurate our predictions were, even after only three years.

At the time, I was thinking some 10 years ahead. In fact, the future has arrived faster than I expected. We predicted two areas of technological development that would impact on learning. The first was the growth of personal, mobile technologies; we described a number of functionalities – communication by audio and text, sharing of media, GPS – and we were on the right path. What we didn’t see was how effectively these functions would all be joined together in one device (the iPhone and Android phones) through the “App”. The second prediction was of data storage moving from individual devices to centralised servers (or “clouds”). This is happening too. I have, for example, recently redirected my personal email to a new Gmail account, rather than the previous combination of hired server and Outlook, and now have access to email on my mobile too.

With hindsight, I would say that we were correct in our predictions, albeit a little conservative. My work here at the BDRA in creating and evaluating a learning space in the virtual world of Second Life suggests that the future is much more exciting than we had dared hope! We were, in fact, closer with an earlier paper (Vavoula et al. 2007) where we used part of a science fiction story from the 1960s to illustrate the future possibilities for technology-enhanced learning. The story is by Brian Aldiss and was written for a children’s science annual about a world, thirty years in the future, where children learn through guided project work rather than formal education.

The winter of 1963 – a suitable year for an Antarctic experience… (© Richard Johnson – see link)

“…It was a simple thing to do. Many of the parts of the miniputer were synthetic bio-chemical units, their ‘controls’ built into Jed’s aural cavity; he ‘switched on’ by simple neural impulse. At once the mighty resources of the machine, equal to the libraries of the world, billowed like a curtain on the fringes of his brain…Its ‘voice’ came into his mind, filling it with relevant words, figures, and pictures. … ‘Of all continents, the Antarctic has been hardest hit by ice.’ As it spoke, it flashed one of its staggeringly vivid pictures into Jed’s mind. Howling through great forests, slicing through grasslands, came cold winds. The landscape grew darker, more barren; snow fell.” (Aldiss, B. 1963)

What we are finding with virtual worlds is that the “user’s” experience is remarkably real, setting in play relevant emotional responses and remaining in memory in many ways as though the experience had been real. Aldiss’s portrait of a virtual trip to Antarctica could be achieved today using virtual world technology.

“Bio-chemical” elements aside, if you were to take today’s virtual worlds back in time to 1963, I venture to suggest that Aldiss would agree we have already achieved his vision.

Paul Rudman, BDRA

Aldiss, B. (1963). The thing under the glacier. In C. Pincher (ed.), Daily Express Science Annual No. 2. Norwich: Beaverbrook Newspapers Ltd.

Rudman, P. D., Sharples, M., Lonsdale, P. & Meek, J. (2008). Cross-context learning. In L. Tallon & K. Walker (eds.), Digital Technologies and the Museum Experience: Handheld Guides and Other Media. Lanham, MD: AltaMira Press.

Vavoula, G. N., Sharples, M., Rudman, P. D., Lonsdale, P. & Meek, J. (2007). Learning bridges: a role for mobile learning in education. Educational Technology Magazine, XLVII, 33–36. New Jersey: Educational Technology Publications, Inc.

A very real experience

I was at a club last weekend (in Second Life, of course…). It’s not that I have a particular liking for dancing puppets, but I do like meeting people from around the world, and a Second Life club is really rather good at facilitating that.

It was an ordinary club night: DJ, friends, friends of friends, strangers. Around me, a conversation began along the lines of:
“How’s your arm?”
“Still sore”

When I enquired what had happened, her reply was well beyond what I expected.

It seems she had just returned from a visit to Japan. To the east coast, north of Tokyo. Her cuts and bruises came from the earthquake. Her broken arm came from holding on to someone to stop them falling. She succeeded. Then they were hit by the water. Eventually, the Japanese army rescued them.

It was far, far worse than that description.

It is my good fortune that I live in the UK and wasn’t involved in the disaster, only hearing second or third-hand stories from commentators and videos. Listening to it first-hand suddenly pulled me into a reality I hadn’t been expecting. I used to imagine that the Japanese people understood earthquakes, that they would be fine. I don’t think they expected this one.

Paul Rudman
BDRA

Places in Second Life

A difficulty many people face in understanding Second Life is that it’s huge. It’s a virtual world, with thousands of different places. Visiting places randomly can leave new visitors with, well, a random impression. In today’s creative meeting here at BDRA, this was likened to following people on Twitter who post “twaddle”, and assuming from that that Twitter is “rubbish”.

I recently discussed Second Life with an academic who was interested in using the virtual world for teaching, and sent him a list of places that I thought would be of interest. The list is just my personal choices, but they are, at least, places that are currently in use and say something interesting about how the virtual world is being used for education.

So I pass on the list, in case it is of interest.

Paul Rudman,
Beyond Distance Research Alliance

PS: Places in Second Life often change, so this list may date quickly (it’s now January 2011). Also, neither I nor BDRA specifically endorse these SL locations (apart from our own!) – other locations are available…

Redevelopment of the Media Zoo Island in Second Life

In line with the launch of the new Media Zoo website, now with a great new banner designed by Emma, I have started work on redeveloping the Media Zoo Island in Second Life – with help from Paul, of course.

The island was built originally to serve as a place to showcase all the projects of Beyond Distance, as well as specifically to run the Second Life MOOSE project. As we move into 2011, and especially because of the requirements of SWIFT, the time has come to move the island into what I’ve called Phase II of its existence.

Part of the DUCKLING project, the oil rig has proved a popular artefact and simulation, and will be part of the 2010 JISC online conference. It is the main feature in the lagoon and, because it wasn’t needed for Phase II, we removed the boat house. The pier has been kept, and the motor launches used to ferry visitors and students to the rig will be moored along it.

Selecting and removing the boat house

Another significant change in the lagoon has been to move and reshape the beach that formed such an important part of our 2010 Learning Futures Festival. The beach now forms part of the peninsula that contains the Saami tent.

The new beach

The main work, however, will come with readapting what were the Safari Park and Breeding Area domes. One will be used to showcase the projects from all four quadrants of the Media Zoo, while the other – at the moment – is likely to become an auditorium/lecture hall/gathering place.

Phase II, which builds upon the success of Phase I, is partly about adapting the island to current needs, but also preparing it for future work and projects.

Join us on Media Zoo Island at http://slurl.com/secondlife/Media%20Zoo/171/102/25.

Simon Kear

Keeper of the Media Zoo

Austerity measures

We’re going to run out of prims.

Our little Media Zoo Island may not be “real”, in the original sense of the word, but it has always managed to have real effects on its visitors. Interest, inspiration, acquiring information, learning, even fun. But with every silver lining comes a cloud, a real effect we could do without – limited resources.

In the physical world, the talk is of “credit crunch” (a dated term already?), economic crisis, cuts. In the virtual world of our Media Zoo Island the limits are much more self-inflicted. We have embarked on a major project, SWIFT, and it’s testing the virtual world of Second Life to its limits. We want to display information in ways this virtual world was never designed for, we want animations that directly support each student’s learning needs at critical moments, and we want a virtual genetics laboratory where 30 students can each have all the equipment they need to practice screening genetic material for inherited diseases. That’s 30 sets of equipment, all in use at the same time.

New SWIFT lab in development

In a physical laboratory, one wouldn’t imagine trying this (at least, not without a multi-millionaire benefactor), but the virtual world is different. Not having to work within the laws of physics – such as time, gravity and cause-and-effect – makes it much easier to create machines than in the physical world. Of course, they only give the illusion of working, but that can be quite sufficient to generate an effective learning experience.

Yet even in the virtual world, there is a cost. Machines and other objects are created using “prims” – malleable building blocks that can be used to create surprisingly effective virtual objects. Even though something like a PCR Thermocycler takes only 44 of these prims, we need twenty such devices, thirty 12-prim UV Transilluminators – the list is long. With everything else on the island, it soon adds up to the 15,000 prim limit.

So, as everywhere, it seems that our virtual world will need some “austerity measures”. We’ve already found enough unused objects to release half the shortfall, and will redesign others to use fewer resources.

Reaching the limit of virtual resources is certainly not the biggest challenge for the SWIFT team, but it is, perhaps, one of the most contemporary.

Paul Rudman, BDRA

What I’ve heard just this week

Epic recently hosted a debate about eLearning at the Oxford Union. Diana Laurillard was among the speakers for the motion ‘This house believes that the e-learning of today is essential for the important skills of tomorrow’, whilst those against were led by Marc Rosenberg (I’m not sure which one). The motion was defeated on the day by 90:144, but I’m told the debate continues on the Epic site (where the vote is now in favour), via YouTube and on blogs such as those of Clive Shepherd and Stephen Downes. Let me know if you have time to catch up with this one, please.

Some of us saw and heard Martin Bean, the new VC of the Open University, speak at ALT-C. He will be giving a shortened version in Second Life (live, using an avatar) on December 16th at 3:30pm. The session will be chaired by Claudia L’Amoreaux (aka Claudia Linden), Education Programmes Manager for Linden Lab. The inworld audience will be limited to 50, but the session will be recorded and archived online for those unable to attend in person. If you are a confident avatar driver and would like to be in the inworld audience, you can send your name and avatar name to virtualworlds@open.ac.uk, with the subject line VC EVENT. First come, first served. Don’t all rush…

The latest Virtual World Watch report from John Kirriemuir (funded by Eduserv) is now available at http://tinyurl.com/ykscp77. He focuses this time on how institutions are choosing/have chosen their particular environment. Second Life features strongly, but there are references to other worlds and their capabilities and/or limits. I gather that the Open University is analysing in detail what’s needed to support learning and which worlds can best provide for it. This report is one I shall try to read for myself.

David Hawkridge