As far as I know, if I’m going to use fusion body tracking in Unity, it’s only possible through Live Link.
I’m using a Jetson Nano, and if I want to use fusion body tracking with Jetson Nanos, do I have to connect multiple ZED cameras to one Jetson Nano and send the fused data to the PC where the Unity project runs?
Or is it a structure where I need multiple Jetson Nanos, connect one ZED camera to each Jetson Nano, merge the data using network fusion on the main PC where Unity is running, and then send that data to the Unity project?
While both solutions are possible in theory, I’d strongly recommend using one Jetson per camera; otherwise, performance will be very low.
To send the body tracking data from the Jetsons to a “server” that will fuse the data, you need to:
On each Jetson, run a program that performs the skeleton tracking and “publishes” the data (via the `startPublishing` method). I added a code snippet in the attachment that does all of that.
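In case you can’t open the attachment, here is a minimal sketch of what such a publisher program can look like, assuming the ZED SDK 4.x C++ Fusion API; the port number and the model/format choices are placeholder values to adapt to your setup:

```cpp
#include <sl/Camera.hpp>
#include <sl/Fusion.hpp> // provides sl::CommunicationParameters

int main() {
    sl::Camera zed;

    // Open the camera attached to this Jetson.
    // Coordinate system and units must match the fusion server's settings.
    sl::InitParameters init_params;
    init_params.coordinate_units = sl::UNIT::METER;
    init_params.coordinate_system = sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP;
    if (zed.open(init_params) != sl::ERROR_CODE::SUCCESS)
        return 1;

    // Positional tracking is required by the body tracking module.
    sl::PositionalTrackingParameters tracking_params;
    tracking_params.set_as_static = true; // cameras are fixed in a fusion setup
    zed.enablePositionalTracking(tracking_params);

    // Enable per-camera body tracking; fused tracking happens on the server.
    sl::BodyTrackingParameters body_params;
    body_params.detection_model = sl::BODY_TRACKING_MODEL::HUMAN_BODY_MEDIUM;
    body_params.body_format = sl::BODY_FORMAT::BODY_38;
    zed.enableBodyTracking(body_params);

    // Publish the detected bodies on the local network
    // (30000 is an example port).
    sl::CommunicationParameters comm_params;
    comm_params.setForLocalNetwork(30000);
    zed.startPublishing(comm_params);

    // Grab loop: each grab makes the latest bodies available to subscribers.
    sl::Bodies bodies;
    while (true) {
        if (zed.grab() == sl::ERROR_CODE::SUCCESS)
            zed.retrieveBodies(bodies);
    }

    zed.close();
    return 0;
}
```

Each Jetson runs this same program; the fusion server identifies every publisher by its camera’s serial number.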
Calibrate your system using ZED 360 (ZED360 - Stereolabs). It will generate a calibration file that you need in order to fuse the data from each Jetson.
On the receiving computer (the “server”), run the Fusion Live Link sample with the calibration file generated by ZED 360. It will fuse the data and send it to Unity.
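For reference, the receiving side roughly follows the pattern below. This is again a hedged sketch against the ZED SDK 4.x C++ Fusion API: the calibration file name is a placeholder, and the actual Live Link sample additionally forwards the fused bodies to Unity over the network.

```cpp
#include <sl/Fusion.hpp>

int main() {
    // Load the calibration file produced by ZED 360 (path is an example).
    std::string config_file = "zed360_calib.json";
    auto configurations = sl::readFusionConfigurationFile(
        config_file, sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP, sl::UNIT::METER);

    // Initialize the fusion module with the same coordinate frame
    // and units as the Jetson publishers.
    sl::Fusion fusion;
    sl::InitFusionParameters init_params;
    init_params.coordinate_system = sl::COORDINATE_SYSTEM::RIGHT_HANDED_Y_UP;
    init_params.coordinate_units = sl::UNIT::METER;
    fusion.init(init_params);

    // Subscribe to each Jetson publisher listed in the calibration file.
    for (auto &conf : configurations) {
        sl::CameraIdentifier uuid(conf.serial_number);
        fusion.subscribe(uuid, conf.communication_parameters, conf.pose);
    }

    // Enable fused body tracking across all cameras.
    sl::BodyTrackingFusionParameters body_fusion_params;
    body_fusion_params.enable_tracking = true;
    fusion.enableBodyTracking(body_fusion_params);

    // Fusion loop: process incoming data and retrieve the fused skeletons.
    sl::Bodies fused_bodies;
    sl::BodyTrackingFusionRuntimeParameters rt_params;
    while (true) {
        if (fusion.process() == sl::FUSION_ERROR_CODE::SUCCESS) {
            fusion.retrieveBodies(fused_bodies, rt_params);
            // The Live Link sample then sends fused_bodies to Unity.
        }
    }

    fusion.close();
    return 0;
}
```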