Markerless AR with a Project Tango
I have recently been experimenting with a Project Tango device, primarily to evaluate it as a possible technology for Augmented Reality and interior wayfinding – a topic that routinely comes up with some of our retail clients. In past AR projects I have used marker images, which contain patterns that a computer vision algorithm can recognize and use to determine the orientation of a device in space relative to the target image. For the most part this is quite effective, but there are limitations. Aside from the need for a printed image, target occlusion, tracking speed, and lighting conditions are all potential pitfalls.

The Tango, an experimental Android device released by Google, has depth-sensing cameras that help determine device orientation and build a dimensional data map of the physical environment. The device can then orient itself, in part, by matching the current feed from the depth cameras against this map. The result is essentially markerless AR, with a much-improved ability to anchor virtual objects to physical geometry and scale.
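That matching-against-a-stored-map behavior corresponds to Tango's area learning feature: the device records an Area Description File (ADF) during a walkthrough and can later report its pose relative to that saved description rather than to wherever tracking happened to start. As a rough illustration, the Java sketch below shows how an app might load a previously recorded ADF and ask for pose updates in the area-description frame via the Tango SDK (com.google.atap.tangoservice). Treat it as a sketch under those assumptions, not production code: the listener interface and config keys have varied between SDK releases, and the ADF UUID passed in is a placeholder.

```java
import java.util.ArrayList;

import android.util.Log;

import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoEvent;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

public class AreaLocalizationSketch {

    // adfUuid is a placeholder for the UUID of an Area Description File
    // recorded earlier on the device; tango is an already-created Tango instance.
    public static void startLocalizing(Tango tango, String adfUuid) {
        // Enable motion tracking and load the saved area description so the
        // device can relocalize against the stored map of the environment.
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION, adfUuid);
        tango.connect(config);

        // Request the device pose expressed relative to the area description,
        // not relative to the arbitrary point where the Tango service started.
        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<TangoCoordinateFramePair>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_AREA_DESCRIPTION,
                TangoPoseData.COORDINATE_FRAME_DEVICE));

        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override
            public void onPoseAvailable(TangoPoseData pose) {
                if (pose.statusCode == TangoPoseData.POSE_VALID) {
                    // Translation is in metres within the ADF frame; rotation
                    // is a quaternion. An AR renderer would feed these into
                    // its virtual camera transform each frame.
                    Log.d("TangoPose", String.format("x=%.2f y=%.2f z=%.2f",
                            pose.translation[0], pose.translation[1], pose.translation[2]));
                }
            }

            @Override
            public void onXyzIjAvailable(TangoXyzIjData xyzIj) {
                // Depth point cloud updates; unused in this sketch.
            }

            @Override
            public void onFrameAvailable(int cameraId) {
                // Color camera frames; unused in this sketch.
            }

            @Override
            public void onTangoEvent(TangoEvent event) {
                // Service events (e.g. relocalization notices); unused here.
            }
        });
    }
}
```

Once a valid pose arrives in the area-description frame, the AR layer simply drives its virtual camera from that translation and rotation, which is what lets rendered content stay locked to real-world geometry and scale without any printed marker.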