How to combine the coordinate systems of two different tracking systems

We are currently using two ZED 2i cameras to run body tracking fusion.

We want to combine the body tracking from the ZED cameras with the positioning from the SteamVR Lighthouse system to support multiplayer in VR. We would like to know how to align the ZED camera coordinate system with the Lighthouse coordinate system.

Any help would be greatly appreciated.

Hi,

During the calibration process in ZED360, one camera (the first one to be loaded) is defined as the world origin, and its position will be (0, H, 0), with H being its height above the floor.
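
To make that concrete, here is a minimal sketch (plain NumPy) of the pose that convention gives the first camera. The height value and the identity rotation are placeholder assumptions; ZED360 estimates the camera's actual tilt during calibration:

```python
import numpy as np

# Pose of the first (origin-defining) camera in the ZED360 world frame.
# H is an assumed example height in meters; identity rotation is only a
# placeholder, the calibration estimates the real orientation.
H = 1.5

T_world_cam = np.eye(4)
T_world_cam[:3, 3] = [0.0, H, 0.0]  # position (0, H, 0)
```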

The challenging part will be getting the pose of the VR system relative to this origin.
I'm not sure what the best solution is, but one option is to print an ArUco marker, attach it to a Lighthouse base station, and run ArUco detection from the "main" camera (using this sample, for example: https://github.com/stereolabs/zed-aruco/tree/master/mono). A rough sketch of that detection step is below.
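
The linked sample is C++; purely as an illustration, here is a hedged Python equivalent using OpenCV's aruco module (OpenCV 4.7+). The intrinsics, marker size, dictionary, and image source below are placeholder assumptions; in practice you would take the intrinsics and the left image from the ZED SDK:

```python
import cv2
import numpy as np

MARKER_SIZE = 0.10  # printed marker side length in meters (assumption)

# Placeholder intrinsics; use the values from your ZED's calibration.
camera_matrix = np.array([[700.0,   0.0, 640.0],
                          [  0.0, 700.0, 360.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)  # ZED images are already rectified

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("left_view.png")  # in practice, grab a left image from the ZED
corners, ids, _ = detector.detectMarkers(frame)

if ids is not None:
    # Marker corners in the marker's own frame (z = 0 plane),
    # ordered like OpenCV's detection output: TL, TR, BR, BL.
    h = MARKER_SIZE / 2.0
    obj_points = np.array([[-h,  h, 0.0], [ h,  h, 0.0],
                           [ h, -h, 0.0], [-h, -h, 0.0]])
    ok, rvec, tvec = cv2.solvePnP(obj_points, corners[0].reshape(-1, 2),
                                  camera_matrix, dist_coeffs)
    R, _ = cv2.Rodrigues(rvec)
    T_cam_marker = np.eye(4)  # pose of the marker in the camera frame
    T_cam_marker[:3, :3] = R
    T_cam_marker[:3, 3] = tvec.ravel()
```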
That gives you the transform from the camera to the Lighthouse; combined with the camera's pose from the ZED360 calibration, you can deduce the transform from the world origin to the Lighthouse.
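
A minimal sketch of that last step, assuming T_world_cam comes from the calibration and T_cam_marker from the detection above, and treating the marker frame as a stand-in for the Lighthouse frame (the fixed offset between the marker and the base station would still have to be measured):

```python
import numpy as np

# Chain the transforms: world -> camera -> marker (stand-in for the Lighthouse).
T_world_lighthouse = T_world_cam @ T_cam_marker

def zed_world_to_lighthouse(p_world):
    """Map a fused skeleton joint from the ZED world frame into the
    Lighthouse frame by inverting the world -> Lighthouse transform."""
    p = np.append(p_world, 1.0)  # homogeneous coordinates
    return (np.linalg.inv(T_world_lighthouse) @ p)[:3]

joint_vr = zed_world_to_lighthouse(np.array([0.2, 1.1, 0.8]))  # example point
```

Keep in mind that the ZED SDK and SteamVR can use different axis conventions, so an extra fixed rotation may be needed on top of this.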

Best,
Benjamin Vallon

Stereolabs Support