ZED Mini Tracking Latency & Unity

Hi All,
I’ve been experimenting with ZED Mini tracking to support rendering of virtual objects on a see-through display. I’ve noticed that the round-trip latency is quite large, so there is a lot of ‘swimming’ of virtual objects against the real-world view (note this is not a traditional mixed reality application).

Is there anything that can be done (perhaps using the internal IMU gyros) to reduce the latency of the reported tracker position and orientation? I note that there are references to timewarping, etc. in the documentation, but I think this only applies to XR applications?

Alternatively, could I gain access to the IMU gyro rates to add compensation/prediction myself? I am already running the tracker at 100 FPS via the Unity ZEDManager component.

Thanks for any pointers.
Dave

Hi @DaveyDave321,

Indeed, timewarping only applies to XR applications.

You can access the IMU data & actual refresh rate through this method in the SDK, mirrored in the Unity plugin as GetInternalSensorsData.
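For reference, here is a rough sketch of how reading it from a MonoBehaviour could look. The exact signature of GetInternalSensorsData and the SensorsData/ImuData member names used here (imu, angularVelocity, timestamp) are assumptions, so please check them against your plugin version:

```csharp
using UnityEngine;

public class ZEDImuReader : MonoBehaviour
{
    public ZEDManager zedManager; // assign the scene's ZEDManager in the Inspector

    private sl.SensorsData sensorsData;

    void Update()
    {
        if (zedManager == null || !zedManager.IsZEDReady)
            return;

        // Pull the latest internal sensors data from the camera.
        // Signature assumed to mirror the SDK call; check your plugin version.
        sl.ERROR_CODE err = zedManager.zedCamera.GetInternalSensorsData(ref sensorsData, sl.TIME_REFERENCE.CURRENT);
        if (err != sl.ERROR_CODE.SUCCESS)
            return;

        // Gyro rates and the IMU timestamp (nanoseconds).
        Vector3 gyro = sensorsData.imu.angularVelocity;
        ulong imuTimestampNs = sensorsData.imu.timestamp;

        Debug.Log($"Gyro: {gyro}  ts: {imuTimestampNs}");
    }
}
```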

We just noticed that the Unity plugin is missing the effective_rate field in the IMUData class, but the data still comes with a timestamp, so you should be able to calculate the rate at runtime. We’ll add it in the next release.
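In the meantime, the effective rate can be estimated from consecutive IMU timestamps, for example inside the same MonoBehaviour as above (assuming nanosecond timestamps):

```csharp
// Estimate the effective IMU rate from two consecutive samples.
// Assumes the timestamp is in nanoseconds and advances when a new sample arrives.
private ulong lastImuTimestampNs = 0;
private float effectiveRateHz = 0f;

void UpdateEffectiveRate(ulong imuTimestampNs)
{
    if (lastImuTimestampNs != 0 && imuTimestampNs > lastImuTimestampNs)
    {
        float dtSeconds = (imuTimestampNs - lastImuTimestampNs) * 1e-9f;
        effectiveRateHz = 1f / dtSeconds;
    }
    lastImuTimestampNs = imuTimestampNs;
}
```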

The IMU sampling rates are given in the camera datasheets; for the ZED Mini it is 800 Hz, so by polling from a separate thread you should be able to access IMU data at that rate. The positional tracking data itself, computed from both the IMU and visual odometry, is updated at about 100 Hz.
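A minimal polling-thread sketch could look like the following. Again, the call and member names are assumptions; also note that you cannot touch the Unity scene API from the polling thread, so just store the latest sample and consume it on the main thread:

```csharp
using System.Threading;
using UnityEngine;

public class ZEDImuPoller : MonoBehaviour
{
    public ZEDManager zedManager; // assign in the Inspector

    private Thread imuThread;
    private volatile bool running;
    private readonly object dataLock = new object();
    private sl.SensorsData latestData; // last sample written by the polling thread

    void Start()
    {
        running = true;
        imuThread = new Thread(PollImu) { IsBackground = true };
        imuThread.Start();
    }

    void OnDestroy()
    {
        running = false;
        if (imuThread != null) imuThread.Join();
    }

    // Runs off the main thread: poll the SDK for new IMU samples (~800 Hz on the ZED Mini).
    private void PollImu()
    {
        sl.SensorsData data = new sl.SensorsData();
        while (running)
        {
            if (zedManager != null && zedManager.IsZEDReady)
            {
                // Assumed signature; do not call UnityEngine scene APIs from this thread.
                if (zedManager.zedCamera.GetInternalSensorsData(ref data, sl.TIME_REFERENCE.CURRENT) == sl.ERROR_CODE.SUCCESS)
                {
                    lock (dataLock) { latestData = data; }
                }
            }
            Thread.Sleep(1); // roughly 1 ms between polls; tune as needed
        }
    }

    void Update()
    {
        sl.SensorsData data;
        lock (dataLock) { data = latestData; }
        // data.imu.angularVelocity is now available on the main thread,
        // e.g. to feed your own orientation prediction before rendering.
    }
}
```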

Do not hesitate to ask more questions if you have any, or if I was unclear or misunderstood something.

I’m eager to see what this project will look like, please share some visuals if you can!

Jean-Loup
