So the iPhone 3GS is about to drop. (I just got an iPhone 3G, so I’m eligible to return it and get this new one.) This is what’s most important about it:
GPS (which pins down the device's location) + digital compass (which determines the direction it's being pointed in) + tilt sensors (which determine the angle it's being held at) + video camera = augmented reality.
The screen shows what the video camera is "seeing". You point it at a building. The sensors "know" you're at longitude X, latitude Y, facing 35 degrees from true north, phone held level. It queries Google Maps and finds out what building/object lies closest along that 35-degree bearing from longitude X / latitude Y.
It puts that information on the screen, overlaid on top of whatever you’re pointing your camera at. What’s this building? It’s the Chelsea Hotel. Click here for the Wikipedia entry, or the Yelp entry. Here’s a photostream of Sid Vicious being escorted out of this building in 1978 after allegedly murdering his girlfriend. Oh, he walked right by where you’re standing.
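That lookup is just geometry: compute the bearing from where you're standing to each nearby point of interest, and pick the one that falls inside the camera's field of view. A rough sketch, with made-up coordinates and a hypothetical `poi_in_view` helper (not anything Apple or Google actually ships):

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees from point 1 to point 2 (0 = true north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def poi_in_view(lat, lon, heading, pois, fov=30.0):
    """Return the POI whose bearing is closest to the compass heading,
    if it falls within the camera's horizontal field of view."""
    best, best_off = None, None
    for name, plat, plon in pois:
        off = abs((bearing_to(lat, lon, plat, plon) - heading + 180) % 360 - 180)
        if off <= fov / 2 and (best_off is None or off < best_off):
            best, best_off = name, off
    return best

# Hypothetical: standing on W 23rd St, pointing north at the Chelsea Hotel.
pois = [("Chelsea Hotel", 40.7444, -73.9969),
        ("Some Bar", 40.7430, -73.9950)]
print(poi_in_view(40.7430, -73.9969, 0.0, pois))   # facing north
print(poi_in_view(40.7430, -73.9969, 90.0, pois))  # facing east
```

Point the phone a different way and a different answer pops out; everything else is just fetching the Wikipedia/Yelp page for whatever name comes back.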
That person walking in front of you? They have Supermagical Bluetooth Person App open. That’s Susie. She’s also a big Arcade Fire fan.
Do you see where I’m going with this? Always-on augmented reality.
There’s also the fact that, if you store position, direction and angle in each photo’s EXIF metadata, every picture taken in any given space with an iPhone 3GS can be used to recreate that space in 3D. If you can pull the focal length and direction out as well, you might even be able to actually reconstruct 3D objects from a collection of 2D images. Like they did in the Matrix sequels, only without the massive budget. No processing. Just calling up pics and arranging them via their own metadata.
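The "arranging via metadata" part really is that dumb: filter to photos shot near a spot, then sort them by compass heading and you get a sweep of viewpoints around one subject without ever looking at a pixel. A sketch, assuming each photo is a dict of (hypothetical) EXIF-derived fields:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    R = 6371000  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def views_of(spot_lat, spot_lon, photos, radius_m=100):
    """All photos shot within radius of a spot, ordered by compass heading --
    a crude walk-around of one subject, assembled purely from metadata."""
    nearby = [p for p in photos
              if haversine_m(p["lat"], p["lon"], spot_lat, spot_lon) <= radius_m]
    return sorted(nearby, key=lambda p: p["heading"])

# Hypothetical photo records (what a 3GS might stamp into EXIF):
photos = [
    {"file": "a.jpg", "lat": 40.7445, "lon": -73.9968, "heading": 180.0},
    {"file": "b.jpg", "lat": 40.7443, "lon": -73.9971, "heading": 90.0},
    {"file": "c.jpg", "lat": 40.7500, "lon": -73.9900, "heading": 10.0},  # blocks away
]
ordered = views_of(40.7444, -73.9969, photos)
print([p["file"] for p in ordered])
```

The far-away shot drops out and the rest come back sorted around the building. Real EXIF already has tags for this (GPSLatitude, GPSLongitude, GPSImgDirection); the 3GS just has the sensors to fill them in.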
I now have a nerd boner.