I'm currently working on a drone-based inspection project that uses two ZED cameras (ZED X, ZED 2, etc.): one camera faces upward (to detect high obstacles or overhead objects), and the other faces downward (for ground or near-ground targets).
Both cameras are connected to the same Jetson device (e.g., Jetson Orin NX).
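For reference, here is a minimal sketch of how I currently open both cameras from a single process with the ZED Python SDK. The camera IDs (0 = up, 1 = down), depth mode, and units are placeholder assumptions, not my final configuration:

```python
import pyzed.sl as sl

def open_camera(camera_id):
    init = sl.InitParameters()
    init.set_from_camera_id(camera_id)      # assumed enumeration: 0 = up-facing, 1 = down-facing
    init.coordinate_units = sl.UNIT.METER   # work in meters everywhere
    init.depth_mode = sl.DEPTH_MODE.NEURAL  # placeholder choice of depth mode
    cam = sl.Camera()
    if cam.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError(f"Failed to open ZED camera {camera_id}")
    return cam

cam_up = open_camera(0)
cam_down = open_camera(1)
```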
The goal is to fuse the measurements from both cameras — point clouds, object detections, and distance estimations — into one common reference frame (the drone body or world coordinate system), so that we can perform unified spatial analysis and collision or clearance evaluation.
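To clarify what I mean by "one common reference frame", here is a rough sketch (Python/NumPy) of the fusion I have in mind: each camera's point cloud is pushed through a fixed camera-to-body rigid transform and the results are concatenated. All extrinsic values below (the ±90° pitch, the ±10 cm offsets) are made-up mounting guesses purely for illustration; obtaining accurate extrinsics without FOV overlap is exactly the part I'm unsure about:

```python
import numpy as np

def to_body_frame(points_cam, T_body_cam):
    """Transform an (N, 3) point cloud from a camera frame into the body frame."""
    pts_h = np.hstack([points_cam, np.ones((points_cam.shape[0], 1))])  # homogeneous coords
    return (T_body_cam @ pts_h.T).T[:, :3]

def pitch_transform(pitch_rad, tz):
    """Build a 4x4 camera-to-body transform: pitch about body Y axis plus a Z offset."""
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    T = np.eye(4)
    T[:3, :3] = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
    T[2, 3] = tz
    return T

# Guessed extrinsics: up camera ~10 cm above the body origin looking up,
# down camera ~10 cm below looking down. These are NOT calibrated values.
T_body_up = pitch_transform(np.deg2rad(+90), +0.10)
T_body_down = pitch_transform(np.deg2rad(-90), -0.10)

# cloud_up / cloud_down would come from each camera's depth/point-cloud output,
# reshaped to (N, 3) with invalid (NaN) rows dropped; random data used here.
cloud_up = np.random.rand(100, 3)
cloud_down = np.random.rand(100, 3)

fused = np.vstack([to_body_frame(cloud_up, T_body_up),
                   to_body_frame(cloud_down, T_body_down)])
```

The same fixed transforms would also be applied to object-detection positions and distance estimates, so everything ends up expressed in the drone body (or world) frame for clearance checks.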
However, in our setup, the two cameras have almost no overlapping field of view, which makes the usual stereo or multi-camera calibration process difficult.
I haven’t tried any existing multi-camera calibration methods yet, because I’m not sure what’s feasible in a non-overlapping case.
So before I proceed, I'd like to ask for advice from the Stereolabs team or anyone who has experience with similar setups.