How to retrieve the whole depth image

Hi Stereolabs,

I am using the streaming module to retrieve the depth image on a PC from an Orin with a ZED X. I followed the sample code:

depth_mat = sl.Mat()
cam.retrieve_measure(depth_mat, sl.MEASURE.DEPTH) #Retrieve depth image
depth_value = depth_mat.get_data().astype(np.uint16)
print("depth_value: ", depth_value.shape)

The shape is correct, but all the values are 0.
Here is how I initialize the parameters:

init_parameters = sl.InitParameters()
init_parameters.depth_mode = sl.DEPTH_MODE.NEURAL_PLUS
init_parameters.coordinate_units = sl.UNIT.MILLIMETER        
init_parameters.depth_maximum_distance = 2.0
init_parameters.depth_minimum_distance = 0.3
init_parameters.sdk_verbose = 1
init_parameters.set_from_stream(self.ip_address.split(':')[0], int(self.ip_address.split(':')[1]))

The values are NaN.
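For reference, a minimal sketch for checking whether the float32 DEPTH map actually contains valid (finite) values before casting it to uint16 (it assumes cam is the already-opened sl.Camera and a frame has been grabbed):

import numpy as np
import pyzed.sl as sl

depth_mat = sl.Mat()
cam.retrieve_measure(depth_mat, sl.MEASURE.DEPTH)  # float32 depth, in coordinate_units
depth_f32 = depth_mat.get_data()

# Invalid pixels are NaN/inf in the float32 map, so count the usable ones first
valid = np.isfinite(depth_f32)
print("valid pixels:", valid.sum(), "/", valid.size)
if valid.any():
    print("depth range:", np.nanmin(depth_f32), np.nanmax(depth_f32))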
Please let me know what I did wrong.

Hi @Jiahe

If I understand correctly, you want to retrieve the depth in millimeters in a UINT16, so I think you want to use sl.MEASURE.DEPTH_U16_MM. I would also initialize the sl.Mat explicitly like here: Depth Sensing Sample L. 92

depth_mat = sl.Mat(res.width, res.height, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)
cam.retrieve_measure(depth_mat, sl.MEASURE.DEPTH_U16_MM)
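
For completeness, here is a minimal end-to-end sketch of that path (the stream IP/port are placeholders and the resolution lookup assumes SDK 4.x naming, so adapt it to your setup):

import pyzed.sl as sl

init_parameters = sl.InitParameters()
init_parameters.depth_mode = sl.DEPTH_MODE.NEURAL_PLUS
init_parameters.coordinate_units = sl.UNIT.MILLIMETER
init_parameters.set_from_stream("192.168.1.10", 30000)  # placeholder IP and port

cam = sl.Camera()
if cam.open(init_parameters) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the stream")

res = cam.get_camera_information().camera_configuration.resolution
depth_mat = sl.Mat(res.width, res.height, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)

runtime_parameters = sl.RuntimeParameters()
if cam.grab(runtime_parameters) == sl.ERROR_CODE.SUCCESS:
    cam.retrieve_measure(depth_mat, sl.MEASURE.DEPTH_U16_MM)
    depth_mm = depth_mat.get_data()  # numpy uint16 array, depth in millimeters
    print(depth_mm.shape, depth_mm.dtype, depth_mm.min(), depth_mm.max())

cam.close()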

Were you using the Depth Sensing sample?

No, I was using the camera streaming Python code. I will try this later.

Hi @JPlou

I have tried it, but the values are still zeros. Here is the link to my script; on the Orin end, I just used the original camera_streaming script here. Should I convert to GPU memory? Please take a look.

@Jiahe, can you try to open the ZED Depth Viewer and verify that the NEURAL depth mode produces good depth maps?

I have checked the Depth Viewer and it works fine. I can get the values by retrieving point clouds, but I cannot directly retrieve the depth.
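For reference, the point-cloud path that works looks roughly like this (a minimal sketch; cam is the opened sl.Camera and x, y are pixel coordinates of your choice):

point_cloud = sl.Mat()
cam.retrieve_measure(point_cloud, sl.MEASURE.XYZRGBA)  # float32, 4 channels per pixel
err, xyz_rgba = point_cloud.get_value(x, y)            # [X, Y, Z, packed color]
if err == sl.ERROR_CODE.SUCCESS:
    print("Z at (x, y):", xyz_rgba[2])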

@Jiahe Are you using ROS or native Python code?

Are you sure that this works as expected?
You are handling 16-bit data, not 8-bit.

I output the depth_value min and max; the values are wrong anyway.

Are you handling the data as 16-bit information instead of 8-bit?

This is my current solution:

Yes, here I load it as uint16, and two lines later I output the values; they are always 0.
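In other words, the check is roughly this (a minimal sketch; depth_mat is the sl.Mat filled by retrieve_measure above):

import numpy as np

depth_value = depth_mat.get_data()  # should be uint16 with MEASURE.DEPTH_U16_MM
print("dtype:", depth_value.dtype)  # confirms 16-bit data, not 8-bit
print("min/max:", depth_value.min(), depth_value.max())
print("non-zero pixels:", np.count_nonzero(depth_value))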

Hi again, I see you initialized the sl.Mat with:

depth_mat = sl.Mat(1080, 1920, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)

You should swap the 1920 and 1080 values; it's supposed to be width first. Alternatively, if you want to retrieve at this resolution by scaling, that's possible, but I think you need to pass the resolution to retrieve_measure (see the API reference). Essentially, the retrieve_measure call should match your sl.Mat.
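
Roughly like this (a minimal sketch, assuming cam is the opened sl.Camera; the scaled resolution is just an example):

# Native resolution: width first when constructing the sl.Mat
depth_mat = sl.Mat(1920, 1080, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)
cam.retrieve_measure(depth_mat, sl.MEASURE.DEPTH_U16_MM)

# Or retrieve at a scaled resolution by passing it to retrieve_measure,
# so that it matches the sl.Mat you allocated
low_res = sl.Resolution(960, 540)
depth_small = sl.Mat(low_res.width, low_res.height, sl.MAT_TYPE.U16_C1, sl.MEM.CPU)
cam.retrieve_measure(depth_small, sl.MEASURE.DEPTH_U16_MM, sl.MEM.CPU, low_res)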

Then, can you check whether there's actually data in depth_mat, by printing some get_value() calls?
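
For instance (a minimal sketch; the probed pixel is arbitrary, here the image center):

# get_value returns an (error_code, value) pair for a single pixel of the sl.Mat
err, center_depth = depth_mat.get_value(depth_mat.get_width() // 2,
                                        depth_mat.get_height() // 2)
print(err, center_depth)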

If there is, this will be a conversion issue of some sort; otherwise, we need to look more into the initialization.

Hello @Jiahe,

I am new to this and need some guidance. I want to use a Jetson Nano and have the camera send data to MATLAB for real-time data collection for my project. Can I use this code to connect the camera data to MATLAB, or do I need to use an API from here: Using the Depth Sensing API - Stereolabs?

Me too. Did you figure it out?

Hi @Ibrahimsh104 @Hua2544

If you’ve read through it already, sorry, but we have a Matlab interface documented here: How to Use Matlab with ZED - Stereolabs

That being said, it’s not compatible with streaming out-of-the-box.
I think the easiest way to make it so is to modify around this part of the code to call setFromStream() (similarly to the call to setFromSVOFile()).

Hi again,

I have updated the script as you suggested, but the values are still always zeros. I have tried to change the coordinate_units, depth_maximum_distance, etc., but nothing changes. The debug output is always:

[zed_streaming_rgbd-1] depth_value: (1080, 1920) 0.0 0.0
[zed_streaming_rgbd-1] center_depth_value: 0

Please take a look; here is the latest code I use.

Never mind, problem solved. I think it's a hardware problem: when I use a laptop it always fails, but a desktop with a powerful GPU works.