Implementing Vuforia Extended Tracking Without a Marker Source


I am developing an augmented-reality app to be used both on Google's Project Tango tablet and on ordinary Android devices. The AR on the normal devices is powered by Vuforia, so its libraries are already available in the project.

While the Tango's capabilities offer a unique opportunity to create a marker-free AR system, its pose data drifts significantly, and that instability makes Tango-only development difficult to justify.

When I was researching Vuforia for eventual inclusion in the app, I came across its Extended Tracking capability. It uses computer vision to provide a tentative estimate of the device's position without the AR marker being onscreen. I tried out the demo, and it actually works great: reasonably accurate, with minimal drift (especially when compared to the Tango's pose data!).

I would like to implement this Extended Tracking feature in the Tango version of the app, but after reading the documentation it appears that the only way to take advantage of it is to activate it while an AR marker is in view; the capability then takes over once the marker disappears from view.
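For context, this is roughly how activation looks in the Vuforia Java API: Extended Tracking is switched on per-trackable, which is why it seems tied to having a marker-backed dataset loaded first. This is a sketch based on the SDK 4/5-era API, and `dataSet` is assumed to be an already loaded and activated `DataSet`:

```java
// Sketch (Vuforia Android Java API, SDK 4/5 era): enable Extended Tracking
// on every trackable in an already-loaded dataset. Assumes Vuforia has been
// initialized and `dataSet` is loaded and active.
for (int i = 0; i < dataSet.getNumTrackables(); i++) {
    Trackable trackable = dataSet.getTrackable(i);
    if (!trackable.startExtendedTracking()) {
        Log.e("AR", "Could not start Extended Tracking on " + trackable.getName());
    }
}
```

Note there is no call that starts Extended Tracking from nothing: it always hangs off a `Trackable`, so some target has to exist to seed the initial pose.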

Is there any way to activate this Extended Tracking feature without requiring an AR marker to source its original position, and simply use it to stabilize and correct error in the Tango's pose data? This seems like the most realistic solution to the drift problem that I've come up with yet, and I'd really like to be able to take advantage of this technology.

1 Answer

Answered by BZC Development

This is my first answer on Stack Overflow, so I hope it helps! I have asked myself the same question about Vuforia, because it can often be more stable with Extended Tracking than with a marker. For example, when the camera is far from a marker and/or at a steep angle, tracking can be unstable; if I then cover up the marker, forcing Extended Tracking to take over, it works better! I haven't come across a way to use only Extended Tracking, but I haven't looked very far.

My suggestion is that you look into using a UDT (User Defined Target). The Vuforia samples show how to use UDTs: they are made so that the user can take a photo of whatever they like and use it as a target. What you could do, though, is take this photo automatically, without user input, and then use the resulting UDT, and the Extended Tracking from the created target.
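The idea above could be sketched roughly like this, following the flow of Vuforia's UserDefinedTargets Android sample (SDK 5-era Java API). This is an assumption-laden sketch, not a drop-in implementation: it assumes Vuforia is initialized and an `ObjectTracker` is running, and the target name and scene-size width are arbitrary placeholders:

```java
// Hedged sketch (Vuforia Android Java API, UserDefinedTargets-style flow).
// Assumes Vuforia is initialized and an ObjectTracker is started; the dataset
// passed in is the one that receives built user-defined targets.
void buildTargetFromCurrentFrame(DataSet udtDataSet) {
    ObjectTracker tracker = (ObjectTracker) TrackerManager.getInstance()
            .getTracker(ObjectTracker.getClassType());
    ImageTargetBuilder builder = tracker.getImageTargetBuilder();

    builder.startScan(); // evaluate camera frames without waiting for a user tap

    // Build a target from the current frame as soon as it looks usable
    if (builder.getFrameQuality() != ImageTargetBuilder.FRAME_QUALITY_LOW) {
        builder.build("auto_udt", 320.0f); // placeholder name and width
    }
    builder.stopScan();

    // Once the built target shows up in the dataset, switch on Extended
    // Tracking so tracking survives the captured "marker" leaving the view
    for (int i = 0; i < udtDataSet.getNumTrackables(); i++) {
        udtDataSet.getTrackable(i).startExtendedTracking();
    }
}
```

The capture would need to be triggered at a moment when the scene has enough detail (checking `getFrameQuality()` as above), since a low-texture frame produces a target that tracks poorly.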

Just a suggestion I thought might be useful. Personally, I find the Tango's tracking amazing and much better than Vuforia's Extended Tracking (to be expected, given the extra sensors), but I suppose it all depends on the environment.

Good luck, and I hope this suggestion works for you,

Beau