ARKit Object Persistence Under the Hood

Update: Since I wrote this, extended tracking has been added in Vuforia 7.2 and ARWorldMap was introduced in ARKit 2.0. While there may still be some viable use cases for registering AR with OpenCV markers, those newer features are likely better solutions for localized or persistent AR. I have also posted an example of the same technique in 100% native iOS code using ArUco.

With Apple’s iOS 11 release of ARKit and Google’s preview release of ARCore, it’s become much easier for developers to ship AR apps that don’t require printed markers. This has made free-movement, world-scale AR possible on many current mobile devices and has led to a rise in impressive and useful new AR apps in the App Store. When building my own AR apps, there is one issue I kept coming up against: not only did I want to anchor an object to the real world, I wanted to register it to that space in such a way that I could persist object positioning on another device or at a later point in time. If I wanted to build a real-time multiplayer AR game that shares the same space with another device, or give my users the ability to load in a previous furniture arrangement in a room-designer app, I was stuck, because ARKit’s SLAM tracking is not registered to real-world space.


ARKit object persistence demos

I’ve been working on a system to maintain object positioning between ARKit sessions. That means a user can place AR objects around a room and then return at a later time, or on another device, and see the objects anchored to the same locations. My system relies on printed markers that are processed with OpenCV to quickly register AR tracking to the initial position of the device in real-world space. Once that initial position is determined, objects can be saved to or loaded from a database and accurately positioned around the room. I think techniques such as this can go a long way toward making AR more useful. I’m working on a much longer post on the actual implementation, but for now I wanted to share two quick demo videos.
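To make the registration step concrete, here is a minimal sketch of the idea using OpenCV’s ArUco module. I’m writing it in Python for readability (the shipped app does this inside Unity), and the marker dictionary, marker size, and camera intrinsics below are all illustrative assumptions: detect a printed marker in a camera frame, then solve for the camera’s pose in the marker’s coordinate frame.

```python
import cv2
import numpy as np

MARKER_SIZE = 0.10  # printed marker edge length in meters (assumed)

# Intrinsics would come from calibration (or ARKit's reported camera
# intrinsics); these values are placeholders for a 1280x720 frame.
camera_matrix = np.array([[1000.0,    0.0, 640.0],
                          [   0.0, 1000.0, 360.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

# Any predefined dictionary works; the app uses whichever one it prints.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def camera_pose_from_marker(frame):
    """Return the camera's 4x4 pose in the marker's frame, or None if unseen."""
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is None:
        return None
    # Marker corners in the marker's own frame (centered, z = 0 plane),
    # ordered to match ArUco: top-left, top-right, bottom-right, bottom-left.
    half = MARKER_SIZE / 2.0
    object_points = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]])
    image_points = corners[0].reshape(-1, 2)
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        return None
    # solvePnP yields the marker-to-camera transform; invert it to get the
    # camera's position and orientation expressed in the marker's frame.
    R, _ = cv2.Rodrigues(rvec)
    pose = np.eye(4)
    pose[:3, :3] = R.T
    pose[:3, 3] = (-R.T @ tvec).ravel()
    return pose
```

That single pose is the anchor for everything else: once the device knows where it is relative to the marker, the marker’s frame becomes a shared origin for every session and device.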

The first demo involves interior wayfinding. Ever been to a large home center store and just wanted to find that one item quickly? Well, here is a solution that lets the shopper simply scan an aisle marker and follow a path to the product of their choice.

The second demo involves annotating a space in AR. I think there are a lot of uses for this, but for now, consider the house guest staying at an Airbnb. Finding things in an unfamiliar kitchen can be a real pain; here is a way AR can make it a lot easier.

Both of these are bare-bones demos, but they are still designed to be fully standalone apps. Users can create and print markers, generate their own paths and annotations, and save them to the cloud, all from within a single mobile app. We’ve tested similar versions of this app at work to help people find conference rooms, help party planners bang out their shopping lists, and even teach owners about their new car.

Built with Unity using ARKit, OpenCV, and Firebase.
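For the save/load side, the trick is to express every object’s pose relative to the marker’s frame rather than the session’s world frame, since the marker is the only thing both sessions share. Below is an equally hedged sketch: the firebase_admin calls and the `anchors/` path are illustrative stand-ins (the actual app uses the Firebase Unity SDK), and every pose is a 4x4 rigid transform.

```python
import firebase_admin
import numpy as np
from firebase_admin import credentials, db

# Hypothetical project credentials; the database URL is a placeholder.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://example-project.firebaseio.com"
})

# Convention: a pose "a_in_b" is a 4x4 matrix mapping frame-a coordinates
# into frame-b coordinates.

def save_object_pose(object_id, object_in_world,
                     camera_in_world, camera_in_marker):
    """Store an object's pose expressed in the marker's frame."""
    world_in_marker = camera_in_marker @ np.linalg.inv(camera_in_world)
    object_in_marker = world_in_marker @ object_in_world
    db.reference(f"anchors/{object_id}").set(object_in_marker.ravel().tolist())

def load_object_pose(object_id, camera_in_world, camera_in_marker):
    """Re-anchor a stored pose into the current session's world frame."""
    flat = db.reference(f"anchors/{object_id}").get()
    object_in_marker = np.array(flat).reshape(4, 4)
    marker_in_world = camera_in_world @ np.linalg.inv(camera_in_marker)
    return marker_in_world @ object_in_marker
```

The one requirement is that `camera_in_world` and `camera_in_marker` are sampled at the same instant, the moment the marker is scanned; from then on, anything stored in the marker’s frame can be re-anchored into any session’s world frame.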

Building a Snapchat Lens with Lens Studio

Just a few days ago Snapchat released Lens Studio, giving developers and 3D artists the ability to create and deploy their own AR lenses. Today I sat down, had a crack at it, and built my first Snapchat lens: a very simple space scene inspired by perspective street art.

Lens Studio is a bare-bones IDE with a lightweight JavaScript-based scripting language and a familiar component-based GUI. All code is bound to scene objects through a small set of events, which gives the developer access to touch interactions, camera position, animation controls, and reasonable API access to the necessary components.