Positioning and occluding objects in AR with photogrammetry
I’ve been working on a demo project that involves putting a pair of sneakers on a statue in a mobile AR app. It presented challenges typical of this kind of project: registering the AR tracking to the physical space, then precisely fitting the shoes around the statue’s feet and ensuring that they were accurately occluded by the legs.
First, here is a quick demo video. Apologies for the choppy frame rate. The screen recording process really hurts performance.
For this demo, I did not want to rely on a printed marker. Since the recent beta for ARKit 2 supports 3D trackable objects and a serializable world map, my initial hope was that I could use the natural features of the environment for localization. But after a few tests, I could tell this was not going to work well: the statue, cast in bronze, has minimal contrasting surface detail and a fairly reflective surface. Further, relocalizing to a saved point cloud of the surrounding environment was slow, did not work reliably in a dynamic outdoor environment, and lacked the precision required for this effect.
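For context, here is roughly what that ARKit 2 world-map workflow looks like. This is a minimal sketch of the standard save/restore API, not the exact code from the demo; the function names `saveWorldMap` and `restoreSession` are my own.

```swift
import ARKit

// Capture the current world map so a later session can try to relocalize
// against it. getCurrentWorldMap is asynchronous and can fail if tracking
// hasn't mapped enough of the environment yet.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("World map unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

// Start a new session seeded with the saved map. ARKit relocalizes once the
// camera sees features matching the stored point cloud -- which is exactly
// the step that proved slow and unreliable outdoors.
func restoreSession(_ session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

The catch is that relocalization quality depends entirely on the feature points ARKit captured, which is why the low-contrast, reflective bronze surface was such a poor fit for this approach.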