Hey,
I’m working with a setup of 2× ZED X One (Narrow) cameras connected via a ZED Link Duo capture card on a Jetson Orin Nano, using the latest SDK version, 5.2.3.
I want to get the distance (depth) of detected objects when using this virtual stereo configuration.
I’m not finding clear documentation or examples on how to detect an object and retrieve its distance from the cameras.
Any guidance or example code would be really helpful.
Niv
Hi,
Thanks for reaching out.
Please follow these instructions:
- https://www.stereolabs.com/docs/cameras/zed-x-one/zed-x-one-stereo
It provides information on virtual stereo calibration and use with our SDK, as well as links to SDK samples for virtual stereo cameras.
Hope this helps.
Stereolabs Support
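For reference, opening the virtual stereo pair described in that page follows the pattern from the linked sample. Below is a minimal sketch, not a verified implementation: the serial numbers are placeholders, and the exact method and parameter names should be checked against the headers of the installed SDK (this code requires the ZED SDK and connected hardware to build and run).

```cpp
#include <sl/Camera.hpp>
#include <iostream>

int main() {
    // Placeholder serial numbers for the two ZED X One units (left, right).
    unsigned int sn_left = 12345678, sn_right = 87654321;

    // Build a virtual stereo input from the two monocular cameras
    // (method name taken from the official virtual stereo sample).
    sl::InputType input;
    input.setVirtualStereoFromSerialNumbers(sn_left, sn_right);

    sl::InitParameters init_params;
    init_params.input = input;
    init_params.coordinate_units = sl::UNIT::METER;

    sl::Camera zed;
    sl::ERROR_CODE err = zed.open(init_params);
    if (err != sl::ERROR_CODE::SUCCESS) {
        std::cout << "Open failed: " << err << std::endl;
        return 1;
    }
    // From here on, the virtual stereo pair behaves like a regular
    // sl::Camera: grab(), retrieveMeasure(), depth, detection, etc.
    zed.close();
    return 0;
}
```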
Hi again,
Thank you for your response and for providing the link to the virtual stereo setup documentation.
I have reviewed the documentation and the sample code in the GitHub repository, but I’m having some difficulty finding examples that specifically demonstrate how to use the dual ZED X One cameras in a virtual stereo configuration for both object detection and depth retrieval. The current examples seem to only cover how to open the virtual stereo cameras using the two ZED X One units but don’t provide much detail on utilizing the full capabilities of the virtual stereo setup, such as detecting objects and obtaining their distances.
None of the samples cover the dual ZED X One cameras in a virtual stereo configuration. While there are examples for other ZED models, I couldn’t find any that are tailored to the dual ZED X One setup, particularly for object detection and distance calculation.
Do I need to adapt the existing examples for other ZED models to work with the dual ZED X One setup?
I appreciate your support!
Best regards,
Niv
Hi @NivC,
Once a virtual stereo system is configured using sl::InputType::setVirtualStereoFromSerialNumbers (as in this example: zed-sdk/virtual stereo/cpp/src/main.cpp at master · stereolabs/zed-sdk · GitHub), you can use all of the methods available in the sl::Camera class.
This means that for an object detection sample like this one: zed-sdk/object detection/birds eye viewer at master · stereolabs/zed-sdk · GitHub, you only need to update the configuration step mentioned above to use the object detection features on the virtual stereo setup.
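Putting the two pieces together, a rough sketch of object detection with per-object distance on the virtual stereo pair could look like the following. This is a sketch under assumptions, not verified against SDK 5.2.3: the serial numbers are placeholders, the detection parameter fields should be checked against the installed headers, and the distance is computed as the Euclidean norm of the object's 3D position in the camera frame. It requires the ZED SDK and connected cameras to build and run.

```cpp
#include <sl/Camera.hpp>
#include <cmath>
#include <iostream>

int main() {
    // Virtual stereo input from two ZED X One units (placeholder serials).
    sl::InputType input;
    input.setVirtualStereoFromSerialNumbers(12345678, 87654321);

    sl::InitParameters init_params;
    init_params.input = input;
    init_params.coordinate_units = sl::UNIT::METER;

    sl::Camera zed;
    if (zed.open(init_params) != sl::ERROR_CODE::SUCCESS) return 1;

    // Enable the built-in object detection module, as in the
    // "birds eye viewer" sample.
    sl::ObjectDetectionParameters od_params;
    od_params.enable_tracking = true;
    if (zed.enableObjectDetection(od_params) != sl::ERROR_CODE::SUCCESS)
        return 1;

    sl::ObjectDetectionRuntimeParameters od_rt;
    sl::Objects objects;
    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
        zed.retrieveObjects(objects, od_rt);
        for (const auto &obj : objects.object_list) {
            // obj.position is the object's 3D centroid in the camera
            // frame; its norm is the straight-line distance in meters
            // (given coordinate_units = METER above).
            float d = std::sqrt(obj.position.x * obj.position.x +
                                obj.position.y * obj.position.y +
                                obj.position.z * obj.position.z);
            std::cout << "Object " << obj.id << " at " << d << " m\n";
        }
    }
    zed.close();
    return 0;
}
```

The key point is that nothing in the detection loop is specific to the virtual stereo setup; only the input configuration step changes.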