Questions about the effect of depth_sensing

Hi!
Recently I ran into a problem when recovering the depth of water. I found that depth_sensing.py does not produce results as good as the ZED Depth Viewer. How do I configure depth_sensing.py so that it behaves like the ZED Depth Viewer?
ZED Depth Viewer:
[screenshot: ZED Depth Viewer depth output]

depth_sensing.py:
[screenshot: depth_sensing.py depth output]
Hi,
You can check the parameters used in the Depth Viewer in its options widget (the gear icon in the top-right corner).
You will then be able to update your sample with the same parameters, which are a combination of sl::InitParameters (DEPTH_MODE) and RuntimeParameters (SENSING_MODE, confidence) …
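For reference, here is a minimal sketch of how those parameters could be set in the Python sample. This assumes the SDK 3.x Python API (where RuntimeParameters still exposes sensing_mode); the DEPTH_MODE, SENSING_MODE, and confidence values below are placeholders that should be replaced with whatever the Depth Viewer's options widget shows:

```python
import pyzed.sl as sl

# Match the Depth Viewer's init parameters (DEPTH_MODE is the key one).
init_params = sl.InitParameters()
init_params.depth_mode = sl.DEPTH_MODE.ULTRA      # placeholder: use the mode shown in the Depth Viewer
init_params.coordinate_units = sl.UNIT.MILLIMETER

zed = sl.Camera()
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

# Match the Depth Viewer's runtime parameters (SENSING_MODE and confidence).
runtime_params = sl.RuntimeParameters()
runtime_params.sensing_mode = sl.SENSING_MODE.FILL    # placeholder: STANDARD or FILL, as in the Depth Viewer
runtime_params.confidence_threshold = 50              # placeholder: copy the confidence slider value
runtime_params.texture_confidence_threshold = 100     # placeholder

depth = sl.Mat()
if zed.grab(runtime_params) == sl.ERROR_CODE.SUCCESS:
    # Depth values are in the units chosen via coordinate_units above.
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
    print("Depth at image center:", depth.get_value(depth.get_width() // 2, depth.get_height() // 2))

zed.close()
```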

Thanks for your reply!
Just to confirm my understanding: as long as the camera parameters are the same, will the two methods produce the same result?

Yes, our tools are simply a fancy UI on top of the C++ API; there is no post-processing in them.

Thank you for your reply!
I would like to know whether the user interface is open source; I want to learn how the camera control and the rendering part are done.

Hi,
We don’t share the code of the ZED tools.
You can use the tools of your choice to build a UI and add the SDK code to it.

Edit: some of the samples also include a basic UI built with OpenGL alongside the camera; you can take a look at those.