Progress, with both ends (Jetson and Windows) manually cleaned and reinstalled to SDK 5.1.2:
- 2-body fused FPS is now in the ballpark of 30, where we would expect it.
- The fusion receiver with a single camera feed still changes the body ID when the subject moves across the centerline (a 2-camera setup does too, and far more frequently). So this is still an issue, though apparently no longer tied directly to fps/sync. Our downstream attempt at fusing similarly-located body IDs (rough sketch below) suggests that when the ID changes on the 1-camera setup, there is a burst of frames in which the data shows two alternating bodies with baseline positions more than a meter apart, even though the rendered skeletons do not seem to diverge by nearly that much.
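For concreteness, that downstream merge heuristic is roughly the following shape. This is a sketch in Node-style JavaScript with made-up names and thresholds (stabilizeIds, MERGE_DISTANCE_M, etc. are ours for illustration, not anything from the SDK), assuming each frame arrives as a list of bodies carrying an id and a root position:

```js
// Sketch of the ID-merging heuristic (hypothetical names/thresholds).
// Each frame: bodies = [{ id, position: [x, y, z] }, ...].
// If a brand-new id appears close to where a recently-vanished id was last
// seen, remap it to the old id instead of treating it as a new person.
const MERGE_DISTANCE_M = 0.5;   // assumption: how close counts as "same person"
const MAX_GAP_FRAMES = 15;      // assumption: how long a vanished id stays eligible

const lastSeen = new Map();     // stable id -> { position, frame }
const remap = new Map();        // raw id -> stable id

function dist(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

function stabilizeIds(bodies, frame) {
  for (const body of bodies) {
    // Apply any remapping decided on earlier frames.
    if (remap.has(body.id)) body.id = remap.get(body.id);

    if (!lastSeen.has(body.id)) {
      // Brand-new id: look for a recently-vanished id last seen nearby.
      for (const [oldId, info] of lastSeen) {
        const gap = frame - info.frame;
        if (gap > 0 && gap <= MAX_GAP_FRAMES &&
            dist(info.position, body.position) < MERGE_DISTANCE_M) {
          remap.set(body.id, oldId);  // treat the new id as the old person
          body.id = oldId;
          break;
        }
      }
    }
    lastSeen.set(body.id, { position: body.position, frame });
  }
  return bodies;
}
```

With the >1 m jumps described above, a tight distance threshold like this simply refuses to merge the two alternating ids, which is consistent with what we are seeing.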
Hey,
Would you mind sharing more information about your setup: network config, Jetson configuration (JetPack, SDK, power mode, was PTP used to sync the Jetsons)? A Jetson fusion sender example would also be very handy. We are struggling even to get ZED360 to display skeletons so we can do the calibration.
Thanks in advance.
Home with crud today so I can't physically check, but it's mostly all spelled out above. ZEDBOX MINI, I believe the 16GB Orin NX, running whatever stripped-down Ubuntu was stock on it. Latest SDK 5.1.2 and the zedbox-mini camera drivers (which are their own thing, not the per-camera packs for other Jetsons). Local wired Ethernet, firewall down at least for now, and no PTP because it's hell on Windows. We have two software configurations which, iirc, ought to work out of the box (I don't think I had to change anything code-wise just to get it working):
- 1 or 2 instances of the C++ sender sample from the fusion reference page (you may need to click open a collapsed section to see it), sending on ports 30000/30002 directly into 1 instance of the zed-unity-fusion receiver. You'll need to either do a 360 calibration locally on the Jetson and hand-edit the resulting json config to point across the network, or hand-point ZED360 on the receiver box at the Jetson's IP and the cameras' serials. (For 1 camera, you need to run 360 with 2 cameras and then hand-remove the 2nd from the config.) This gets 30fps, but loses body ID very easily.
- 1 instance of the zed-unity-fusion application directly on the Jetson with a fully local 360 config, plus a NodeJS script to catch the UDP stream output by the fusion sample and relay it to a more forgiving TCP socket that external machines can subscribe to (rough sketch below). This seems fairly solid on body ID, but only clocks ~20fps since the Jetson is doing more work.
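The relay itself is nothing fancy; a minimal sketch of the idea in Node (ports and bind addresses here are placeholders, not the real ones) is to catch each UDP datagram from the local fusion sample and rebroadcast it, newline-delimited, to every connected TCP subscriber:

```js
// UDP -> TCP relay sketch (hypothetical ports; the real script differs in details).
const dgram = require('dgram');
const net = require('net');

const UDP_PORT = 20001;   // assumption: port the fusion sample sends its UDP stream to
const TCP_PORT = 20002;   // assumption: port external clients subscribe on

const subscribers = new Set();

// TCP side: external machines connect here and receive the relayed stream.
const server = net.createServer((socket) => {
  subscribers.add(socket);
  socket.on('close', () => subscribers.delete(socket));
  socket.on('error', () => subscribers.delete(socket));
});
server.listen(TCP_PORT, '0.0.0.0');

// UDP side: catch each datagram from the local fusion sample and fan it out.
const udp = dgram.createSocket('udp4');
udp.on('message', (msg) => {
  for (const socket of subscribers) {
    socket.write(msg);
    socket.write('\n'); // delimit datagrams so TCP clients can re-split the stream
  }
});
udp.bind(UDP_PORT);
```

External clients then just open a TCP connection to that port, split on newlines, and parse each message in whatever format the sender emits.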
Tracking & fitting settings are default, which iirc means off on the sender and on on the receiver. Depth is NEURAL_LIGHT if on the Jetson and probably NEURAL on a real PC; HUMAN_BODY_ACCURATE regardless, with the 38-joint body format (although the default 18 should be fine if you don't need more).