So I’ve played with a couple of the augmented reality apps available for the (non-jailbroken) iPhone — Yelp, with its Monocle function, and Bionic Eye — and my conclusion is that navigation AR apps are completely pointless.

You see, augmented reality takes into account your position and direction. What it doesn’t do — can’t do, until somebody figures out a way to digitize the dimensions of every building on the planet — is take into consideration your actual physical space. To an AR app, the world is flat, and landmarks or points of interest are simply vectors.

When you open Bionic Eye and point it around, it does what’s advertised: it shows you badges for various points of interest like restaurants and coffee shops and tourist destinations. The app works perfectly well at this. The problem is that this information is rather pointless. If the POI is close enough for me to see it, I don’t need the app in the first place; if it’s not, simply knowing that there’s a McDonald’s somewhere in that general direction — on the other side of that block of office buildings, or that stand of trees — is not useful to me. More useful is the ability to see myself represented on an overhead map, with streets, and the location of the McDonald’s as a point at the other end of a series of navigational instructions. This is, of course, technology we already have, and AR does not improve upon it in any useful way.

So what is augmented reality actually good for? Quite a lot, I think. Foremost in my mind, AR serves as a sort of hypertext for reality, providing context and depth. When I was playing with ideas about AR back in 2002–2003, the example I used with friends was this: you’re in New York. You see an interesting building. You hold up your AR device (which at the time I imagined as a sort of hyper-PDA) and it tells you that this is the Chelsea Hotel.
It provides you with a set of hyperlinks to information that can be overlaid on top of what you’re looking at — the Wikipedia entry, photographs of Sid Vicious being led out by police after (allegedly) murdering his girlfriend, links to songs about the place. In this way, your experience of this building in Manhattan is enriched, contextualized in interesting ways.

Another example: perhaps you’re thinking about going into that sushi place across the street. You point your AR device at it, and it gives you the menu with prices, a list of reviews, and the phone number for the place so you can make reservations if necessary. A simple, elegant layer on top of your world.

There are other, more obscure, but equally interesting uses. A device that can take pictures and store in them the location, direction and angle of the camera could be used to build 3D versions of physical locations, at a fraction of the processing power required by similar initiatives that rely on existing imaging technology. A wonderful toy might be an AR app that scans Flickr for other photographs taken from your location and angle and overlays them onto your screen, showing you a kind of animated collage of what your physical location has looked like over time. The possibilities are endless.

One application which completely fascinates me — one that I’ve not heard anyone else mention before — is the idea of AR as a medium for narrative storytelling. This idea was probably first considered by Bruce Wagner in the strange and visionary TV mini-series Wild Palms, back in 1993. In the series, TV consumers buy a special box that sits on top of their TV set and “scans” their living room, noting the locations of furniture and doorways. The characters of the show-within-the-show, Church Windows, are holographically projected directly into your living room; they sit on your couch and ring your front doorbell. You can watch them from any angle.
(Part of the plot of the show is a tailor-made LSD variant that tricks your brain into thinking you’re physically interacting with them. I’d love to see somebody play with that IRL.)

But why not an AR drama, played out on the streets of the city where you live? You could watch imaginary events unfold right in front of you: a meeting between doomed lovers on the street corner across from your apartment, or a zombie apocalypse unfolding at Starbucks. The most obvious narrative is something reminiscent of Neil Gaiman’s Neverwhere, in which the inhabitants of a fictional London Below are roundly ignored by the dull inhabitants of London Above; that would be a great narrative trick to explain why the real people you view through your AR lens aren’t reacting or responding to the overlaid drama that commingles with them on the screen of your future phone.

Of course, these are just a few possibilities off the top of my head; as the months and years progress, I’m sure we’ll see amazing ideas I haven’t even begun to consider come out of the AR universe. But I’m pretty convinced that the navigation tools that are our first examples of AR in the real world are lame ducks; what worries me is that their lameness will convince people to abandon the concept (much like virtual reality in the 1990s) before it really has a chance to show off.
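A footnote for the technically curious: the “flat world” critique above can be made concrete. As far as I can tell, a navigation AR app essentially computes a compass bearing from your GPS fix to each POI and checks whether that bearing falls inside the camera’s field of view; it knows nothing about the geometry in between. Here is a minimal sketch of that math in Python (the coordinates and the 60-degree field of view are illustrative assumptions, not anything a real app exposes):

```python
import math

def bearing_to_poi(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user at (lat1, lon1)
    to the POI at (lat2, lon2), in degrees clockwise from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def poi_in_view(user_lat, user_lon, heading, poi_lat, poi_lon, fov=60):
    """True if the POI's bearing falls within the camera's horizontal
    field of view. Note what is missing: no terrain, no building
    footprints, no line-of-sight test. The world is flat and the
    POI is just a vector."""
    diff = (bearing_to_poi(user_lat, user_lon, poi_lat, poi_lon)
            - heading + 180) % 360 - 180
    return abs(diff) <= fov / 2
```

Note what’s absent: no elevation, no occlusion, no line-of-sight. That McDonald’s badge gets drawn whether or not a block of office buildings stands between you and it.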