Most accurate tracking with VR headset (ArUco, plane detection, or internal only)

Hello,
I want to “put” a virtual object into the the “real world” captured by the ZED, mounted on an HTC Vive Pro 2.
The object just has to sit on a table, without any movement to it, with the headset being able to walk around it & get closer or farther away.
I tried ArUco-based marker tracking, but it flickers a lot and seems mostly intended for objects that are moved around.
Plane detection, as far as I understand, only places a plane mesh into virtual space; it doesn’t add any “tracking” of its own but relies entirely on the internal tracking?

I’d appreciate clarification and advice: should I use ArUco tracking for this, and if so, are there ways to make it more believable? Or is the internal tracking of the ZED Mini + Vive Pro better? (I haven’t tried them together yet; the tracking of the ZED alone seems unreliable, though.)

Hi,

Yes, the plane detection feature uses the internal tracking of the ZED SDK.

By default, in the Unity plugin, the camera tracking is fused with the headset tracking.
In some cases, the tracking of the ZED can decrease the overall quality of the tracking. You can try disabling the camera tracking (uncheck the “Enable tracking” option in the ZED Manager).
In that case, you will use the tracking of the headset only.
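If you prefer to control this from code rather than the inspector, a minimal sketch is below. It assumes the ZEDManager component exposes an `enableTracking` field (as in recent versions of the zed-unity plugin); check the field name against your plugin version before relying on it.

```csharp
using UnityEngine;

// Sketch: disable the ZED's own positional tracking at startup so the rig
// relies on the headset (SteamVR) tracking alone. Assumes ZEDManager has a
// public "enableTracking" field, which may differ between plugin versions.
public class UseHeadsetTrackingOnly : MonoBehaviour
{
    void Awake()
    {
        ZEDManager manager = FindObjectOfType<ZEDManager>();
        if (manager != null)
        {
            // Must be set before the camera finishes initializing.
            manager.enableTracking = false;
        }
    }
}
```

This is equivalent to unchecking “Enable tracking” in the ZED Manager, just done programmatically.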

Stereolabs Support


Thank you for your answer!
Concerning the ArUco tracking, how often does the plugin “reposition” the marker — once per frame? Can I change that frequency? I’m asking because the ongoing tracking seems off, even though the initial position looks very good.

Hi,

Yes, it runs the marker detection on every new image (in this function: https://github.com/stereolabs/zed-unity/blob/master/ZEDCamera/Assets/Samples~/OpenCV%20ArUco%20Detection/Scripts/Core/ZEDArUcoDetectionManager.cs#L105).
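Since the object in this use case never moves, one way to reduce the flicker is to stop applying the raw per-frame marker pose directly: either low-pass filter it, or lock the object’s pose after the first confident detection and let the (fused) headset tracking take over. The sketch below illustrates both; `markerTransform` stands in for whatever transform the ArUco detection updates — the names are illustrative, not part of the plugin’s API.

```csharp
using UnityEngine;

// Sketch: instead of snapping the virtual object to the raw ArUco pose every
// frame (which flickers), smooth the pose or anchor it once. "markerTransform"
// is a placeholder for the transform your detection setup updates each frame.
public class MarkerPoseStabilizer : MonoBehaviour
{
    public Transform markerTransform;   // raw pose updated by the detector
    public Transform anchoredObject;    // the object that should sit on the table
    [Range(0f, 1f)] public float smoothing = 0.1f; // fraction of new pose used per frame
    public bool lockAfterFirstDetection = false;

    private bool anchored = false;

    void LateUpdate()
    {
        if (markerTransform == null || anchoredObject == null) return;

        if (lockAfterFirstDetection)
        {
            if (!anchored)
            {
                // Take the first detected pose, then rely on headset tracking.
                anchoredObject.SetPositionAndRotation(markerTransform.position,
                                                      markerTransform.rotation);
                anchored = true;
            }
            return;
        }

        // Exponential smoothing toward the latest detected pose.
        anchoredObject.position = Vector3.Lerp(anchoredObject.position,
                                               markerTransform.position, smoothing);
        anchoredObject.rotation = Quaternion.Slerp(anchoredObject.rotation,
                                                   markerTransform.rotation, smoothing);
    }
}
```

For a static table-top object, the lock-after-first-detection mode is usually the most believable, since the marker then only serves to establish the initial anchor.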

Stereolabs Support
