ARKit object persistence demos
I’ve been working on a system to maintain object positioning between ARKit sessions. That means a user can place AR objects around a room and then return at a later time, or on another device, and see the objects anchored to the same locations. My system relies on printed markers that are processed with OpenCV to quickly register AR tracking to the device’s initial position in real-world space. Once that initial position is determined, objects can be saved to or loaded from a database and accurately positioned around the room. I think techniques like this can go a long way toward making AR more useful. I’m working on a much longer post on the actual implementation, but for now I wanted to share two quick demo videos.
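To give a rough idea of the registration step, here is a minimal sketch of the underlying math, not the actual implementation. It assumes OpenCV (e.g. `cv2.solvePnP` on a printed marker’s corners) has already produced the marker’s pose in the current session frame; saved object positions, stored relative to the marker, can then be mapped into that session’s coordinates. All names and numbers below are illustrative.

```python
import numpy as np

def pose_to_matrix(rotation, translation):
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def restore_positions(marker_pose, saved_offsets):
    """Map positions saved relative to the marker into session space."""
    restored = []
    for offset in saved_offsets:
        p = marker_pose @ np.append(offset, 1.0)  # homogeneous coordinates
        restored.append(p[:3])
    return np.array(restored)

# Illustrative example: a marker rotated 90 degrees about the vertical
# axis, sitting 2 m in front of the session origin.
yaw90 = np.array([[0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0],
                  [-1.0, 0.0, 0.0]])
marker_pose = pose_to_matrix(yaw90, [0.0, 0.0, -2.0])

saved = np.array([[1.0, 0.0, 0.0],   # 1 m to the marker's right
                  [0.0, 0.5, 0.0]])  # 0.5 m above the marker
print(restore_positions(marker_pose, saved))
```

The same transform works in reverse when saving: an object’s session-space position is multiplied by the inverse of the marker pose before being written to the database, so the stored coordinates stay device-independent.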
The first demo involves interior wayfinding. Ever been to a large home center store and just wanted to find that one item quickly? Well, here is a solution that lets the shopper simply scan an aisle marker and follow a path to the product of their choice.
The second demo involves annotating a space in AR. I think there are a lot of uses for this, but for now, consider the house guest staying at an Airbnb. Finding things around an unfamiliar kitchen can be a real pain, and here is a way AR can make this a lot easier.
Both of these are bare-bones demos but are still designed to be fully standalone apps. Users can create and print markers, generate their own paths and annotations, and save them to the cloud, all from within a single mobile app. We’ve tested similar versions of this app at work to help people find conference rooms, help party planners bang out their shopping lists, and even teach owners about their new car.
Built with Unity using ARKit, OpenCV, and Firebase.