If I understand correctly, you want to retrieve the depth in millimeters as UINT16, so I think you want to use sl.MEASURE.DEPTH_U16_MM. I would also initialize the sl.Mat explicitly, as in the Depth Sensing Sample, l. 92.
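Not sure this matches your setup exactly, but here is a minimal sketch of what I mean (the depth mode and camera-info accessors are assumptions based on a recent SDK version; the small helper just illustrates what DEPTH_U16_MM stores):

```python
import numpy as np

def depth_m_to_u16_mm(depth_m: np.ndarray) -> np.ndarray:
    """Illustrates what DEPTH_U16_MM holds: depth in millimeters,
    clipped to the uint16 range; invalid readings (NaN/inf) become 0."""
    mm = np.nan_to_num(depth_m, nan=0.0, posinf=0.0, neginf=0.0) * 1000.0
    return np.clip(mm, 0, 65535).astype(np.uint16)

def grab_depth_u16():
    # Sketch only -- requires a connected ZED camera to actually run.
    import pyzed.sl as sl
    zed = sl.Camera()
    init = sl.InitParameters()
    init.coordinate_units = sl.UNIT.MILLIMETER
    if zed.open(init) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("camera open failed")
    # Accessor path assumed from SDK 4.x; older versions differ slightly.
    res = zed.get_camera_information().camera_configuration.resolution
    # Explicit sl.Mat init: width first, U16 single channel, CPU memory.
    depth_mat = sl.Mat(res.width, res.height, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)
    if zed.grab() == sl.ERROR_CODE.SUCCESS:
        zed.retrieve_measure(depth_mat, sl.MEASURE.DEPTH_U16_MM, sl.MEM.CPU)
        return depth_mat.get_data()  # numpy uint16 array, shape (H, W)
    return None
```

The point is that the sl.Mat's type (U16_C1) and size match the measure you ask retrieve_measure for.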
I have tried it, but the values are still zeros. Here is the link to my script; on the Orin end I just used the original camera_streaming script here. Should I convert to GPU memory? Please take a look.
You should swap the 1920 and 1080 values; the sl.Mat constructor takes the width first. Alternatively, if you want to retrieve at a scaled resolution, that is possible, but I think you need to pass the resolution to retrieve_measure (see the API reference). Essentially, the retrieve_measure call should match your sl.Mat.
Then, can you check whether there is actually data in depth_mat, for instance by printing a few get_value() results?
If there is, this is likely a conversion issue of some sort; otherwise we need to look more into the initialization.
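Something like this sketch is what I have in mind for the check (probe_center assumes an already-filled sl.Mat; the validity helper is just my own convention for "usable reading"):

```python
import math

def is_valid_depth(v) -> bool:
    """A depth reading is usable if it is finite and strictly positive;
    0 / NaN / inf mean no measurement at that pixel."""
    try:
        f = float(v)
    except (TypeError, ValueError):
        return False
    return math.isfinite(f) and f > 0.0

def probe_center(depth_mat):
    # Sketch only: sample the center pixel of a retrieved sl.Mat.
    # sl.Mat.get_value takes (x, y) and returns (ERROR_CODE, value).
    x = depth_mat.get_width() // 2
    y = depth_mat.get_height() // 2
    err, value = depth_mat.get_value(x, y)
    print(f"center ({x}, {y}): err={err}, value={value}, "
          f"valid={is_valid_depth(value)}")
    return value
```

If the center pixel is valid but your converted output is all zeros, the problem is on the conversion side, not the retrieval.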
I am new to this and need some guidance. I want to use a Jetson Nano and have the camera send data to MATLAB for real-time data collection in my project. Can I use this code to connect the camera data to MATLAB, or do I need to use an API from here: Using the Depth Sensing API - Stereolabs?
I have updated the script as you suggested, but the values are still always zeros. I have tried changing coordinate_units, depth_maximum_distance, etc., but nothing changes. The debug output is always: [zed_streaming_rgbd-1] depth_value: (1080, 1920) 0.0 0.0 [zed_streaming_rgbd-1] center_depth_value: 0. Please take a look; here is the latest code I use.