By now, many of you will have found it hard to avoid hearing about the latest technology buzz. No, not Cloud Computing. This technology is, shall we say, more ‘tangible’. It is – quite literally – technology that you can hold in your hand.
Yes folks, there’s a growing buzz around the rather snazzy and futuristic-sounding ‘Augmented Reality’ (AR). The headlines are growing, the clamour is getting more excitable by the day, and even though it’s only really hit the public consciousness relatively recently, I don’t think we’re far away from that glorious, early-doors hype bubble popping to the sounds of “well… there aren’t many apps!”, “is it just for restaurants and tube stations?” and “it’s not a game-changer, it’s a fad!”. For that last one, just look at the James Cameron Avatar 3D story (and nobody’s even seen it yet!)
Still. It’s an exciting technology and one that – like Cameron’s 3D in cinema – will be a game-changer (imo), not simply a fad, given the opportunities it will open up to enable and enhance the immersive delivery of rich content to mobile platforms. As with anything, though, it’s not going to happen overnight. Right now the bugbear with 3D is the lack of supporting cinema screens, and something similar applies to AR-capable devices. However, that will inevitably change.
There are two types of AR application at the moment: mobile and desktop. Desktop AR uses marker-based images to create animations (both 2D and ‘3D’) and – in some cases – also includes interactive controls. I’ll cover this in a future post, I think. For this brief post, though, I’m talking about mobile AR. This is where an application uses your phone’s GPS to know where you are and its magnetometer – or, more simply, its digital compass – to know which way you’re facing. Couple those together, then add in the live video feed coming through your phone’s camera and bingo! We have location-aware data overlaid on your image of the world around you.
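To make that “couple those together” step a bit more concrete, here’s a rough sketch of the core maths a mobile AR app does: work out the compass bearing from your GPS position to a point of interest, compare it with the heading from the magnetometer, and place a label on the camera image accordingly. This is purely illustrative – the function names, the 60° field of view and the 320-pixel screen width are my own assumptions, not taken from any real AR SDK.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing, in degrees (0-360), from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def screen_x(poi_bearing, heading, fov=60, width=320):
    """Horizontal pixel position for a point of interest, or None if it is
    outside the camera's (assumed) field of view."""
    # Signed angle between where you're facing and where the POI is, -180..180.
    offset = (poi_bearing - heading + 180) % 360 - 180
    if abs(offset) > fov / 2:
        return None  # off-screen: the app wouldn't draw this label
    # Map the angular offset linearly onto the screen width.
    return width / 2 + (offset / (fov / 2)) * (width / 2)

# A POI due east of you, while you're facing east, sits dead centre:
print(screen_x(bearing_to(0.0, 0.0, 0.0, 1.0), heading=90.0))  # 160.0
```

Real apps like Layar and Wikitude will be doing something far more refined (distance filtering, tilt from the accelerometer, and so on), but this is the essence of turning “where am I, and which way am I facing?” into a pixel position on the live video feed.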
At the moment I’ve got three AR apps on my iPhone – Robotvision (http://robotvision-ar.com/), Wikitude (http://www.wikitude.org/) & Layar (http://layar.com/). I’ll write a post covering these soon, but for now the obvious question is simply, “well, how could they be used in a learning activity?” – Oh, and yes, I’m excluding ‘learning where the nearest Costa Coffee is’ from my criteria. Now I’m no teacher, but I can see this…
Imagine you’re studying Local History, looking at the changing architecture and layout of the town centre. You stand on a street, point your phone* at a scene and, overlaid on the live image, are archive photographs of the location spanning the decades. Touch the image of the Town Hall and you’re given the option of viewing a Flickr pool, visiting the town hall’s website, or going to the Wikipedia page and reading up on the history of the building. Buildings, street views, whole town layouts perhaps…
* I must point out that I use the term ‘phone’ very loosely here. What I really mean is the “smartphone”, the pocket-sized computer. The gadget in my pocket that is already more powerful than the old PC I have in the back bedroom.
Or let’s say you’re on a Geology field trip, trekking around the Isle of Wight. Point your mobile device (see, we’re evolving already!) at a nearby outcrop and there, on top of the scene coming through your camera, are some controls that will display detailed information about what you’re looking at. That it’s made of sandstone, perhaps, or that it was formed by river rather than by wind – the difference between river and wind formations, right there in front of you!
So… AR meets Social Media meets The Cloud, you could say. Like I say, there’ll be further posts around this from me, where I’ll attempt to look a bit closer at the applications out there and share my thoughts on them.