Dactyl Nightmare

Do you remember the blocky virtual reality rig that was in malls back in the 90s? You put this large rig over your head and controlled movement with hand controls. The idea was to walk around while physically turning your body and moving your head. The virtual environment would shift as you moved.
Fast forward to January 9, 2007: Macworld and the unveiling of the iPhone. The iPhone brought augmented reality to the average consumer. Well, a few generations of iPhone later, anyway. We had to wait for the camera to get better and for the iPhone's accelerometers to get accurate enough. With those technologies you can look at the world through the lens of the iPhone and have whatever you want projected onto it. The Yelp app, for example, can display restaurant reviews over the actual restaurants when viewed through the screen.
Google Glass

Google Glass isn't officially out yet, but it's coming. It's a device you wear like sunglasses, and it displays information overlaid on the reality you can see in front of you. When tied to a sub-meter GPS (via Bluetooth), the possibilities are endless.
For a while now I’ve envisioned recording sites while using Google Glass or some equivalent device. If you have the coordinates, and any other information you want, in an attribute table that the device can read, there is no reason why that information can’t be displayed on the world in front of you. The device would have to know your current position (possible today), your height (possible today), and its exact orientation (possible today). That’s it. We can do this right now!
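The core math behind that overlay is simple enough to sketch. Given the observer's position and the device's compass heading, you can work out where a recorded site point should be drawn in the wearer's field of view. The function names, the flat-earth approximation, and the 90-degree field of view below are my own illustrative assumptions, not anything from a real Glass API:

```python
import math

def bearing_and_range(obs_lat, obs_lon, site_lat, site_lon):
    """Compass bearing (degrees) and ground distance (meters) from the
    observer to a site point, using a flat-earth approximation that is
    fine at site scale (a few hundred meters)."""
    # Meters per degree, approximated at the observer's latitude.
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(obs_lat))
    dx = (site_lon - obs_lon) * m_per_deg_lon   # east component
    dy = (site_lat - obs_lat) * m_per_deg_lat   # north component
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    return bearing, math.hypot(dx, dy)

def screen_offset(bearing, heading, fov_deg=90.0):
    """Horizontal screen position (-1 = left edge, +1 = right edge) of a
    point at `bearing` when the device faces `heading`; None when the
    point falls outside the assumed field of view."""
    delta = (bearing - heading + 180.0) % 360.0 - 180.0  # wrap to -180..180
    if abs(delta) > fov_deg / 2:
        return None
    return delta / (fov_deg / 2)
```

A point 90 meters due east of the wearer sits at bearing 90; face east and it draws dead center, face north and it's outside the view entirely.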
I used the iDraw app for iPad to sketch up some ideas. It seems reasonable that eventually we should be able to send real-time information directly and wirelessly to the device while recording a site. As the GIS person completes a point or a polygon, that information could go straight to the Crew Chief’s augmented reality device. I’m imagining walking around doing feature descriptions, looking up, and seeing a virtual representation of what’s been done. Check out the representation below.
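The wireless hand-off could be as plain as a small JSON message per completed feature. The schema below is entirely hypothetical, just a sketch of the idea: geometry plus whichever attribute fields the Crew Chief should see floating over the site:

```python
import json

def feature_message(feature_id, geom_type, coords, attributes):
    """Serialize a completed feature as a compact JSON message for the
    headset to overlay. This schema is a made-up example; the point is
    that geometry and attributes travel together as one small payload."""
    return json.dumps({
        "id": feature_id,
        "type": geom_type,       # e.g. "point" or "polygon"
        "coords": coords,        # [[lon, lat], ...] in WGS84
        "attrs": attributes,     # e.g. {"feature": "hearth"}
    })

# Example: a point the GIS tech just recorded, ready to push to the device.
msg = feature_message("F-017", "point",
                      [[-109.999, 35.0]], {"feature": "hearth"})
```

One message per save keeps the device's picture current without either side having to sync a whole database in the field.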
Keep in mind that all of this is possible now. Or at least it will be once Google releases the glasses. I’ve heard the Google Glass SDK (software development kit) costs $1,000. So, with that money, the time, and of course the knowledge, someone could develop this right now and have it ready for when the glasses ship.
To quote a song from the 80s, “The future’s so bright, I gotta wear shades.” The future of archaeology is so bright that we might be wearing augmented reality sunglasses before you know it. Are you ready? You’d better be.
Thanks for reading and I’ll see you in the field!