Is there a way to retrieve all depth values in Unity?

Hello Stereolabs team and community!

I’m developing a game where players throw physical balls at targets on a wall, and I thought about using the depth map functionality to detect when a ball hits a target. By mapping the targets beforehand and knowing the position where the ball impacted the wall, I can tell whether it hit a target.

My initial implementation used shaders. For the tests, I grabbed the texture inside ZED_Rig_Mono->Camera_Left->Frame and assigned it to a custom shader. In that shader I tried to keep only the pixels close to the wall, but the output was a bit unstable and imprecise. I think part of this was due to the depth normalization, which makes the whole depth map change when a new object appears in the camera view. (Check the image attached in this post for a very clear example. Thanks @Neeklo!)

The red square is the ball going towards the wall

So I tried a second approach: building my own depth map from the real depth values. But the only method I found was ZEDCamera->GetDepthValue, which only retrieves the real depth values pixel by pixel. The resolution I use is 1920x1080, and iterating over all that data takes around 11 seconds. I tried downscaling the final texture by a factor of eight and got something that could work, taking only 0.165 seconds.
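
For reference, this is roughly what the slow version looked like. It is only a sketch: I am assuming here that GetDepthValue is called on the sl.ZEDCamera instance and takes the pixel position as a Vector3, which is how I was using it, so adjust it to the actual signature in your plugin version.

// Rough sketch of the slow approach: one GetDepthValue call per sampled pixel.
// With downscale = 1 this took ~11 seconds at 1920x1080; with downscale = 8, ~0.165 seconds.
float[,] BuildDepthMap(sl.ZEDCamera zedCamera, int width, int height, int downscale)
{
    int w = width / downscale;
    int h = height / downscale;
    float[,] depthMap = new float[w, h];

    for (int y = 0; y < h; y++)
    {
        for (int x = 0; x < w; x++)
        {
            // One managed call per pixel is what makes this so slow at full resolution.
            depthMap[x, y] = zedCamera.GetDepthValue(new Vector3(x * downscale, y * downscale, 0));
        }
    }
    return depthMap;
}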

But I was wondering, is there an easier and quicker way to do this?
Something like GetDepthValue, but instead of returning the data pixel by pixel, returning an array of floats to speed up the process.

Thanks!

Hi,

You can access the depth texture here: zed-unity/ZEDCamera/Assets/ZED/SDK/Helpers/Scripts/Display/ZEDRenderingPlane.cs at master · stereolabs/zed-unity · GitHub

This is the texture used in the shader to compute the occlusions.
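
Something like this should give you access to it from a script. It is only a sketch: I am assuming the ZEDRenderingPlane component sits on the left camera of ZED_Rig_Mono and that its Depth property exposes the texture, so adjust the lookup to your scene.

using UnityEngine;

public class DepthTextureAccess : MonoBehaviour
{
    private ZEDRenderingPlane _renderingPlane;

    void Start()
    {
        // Hypothetical lookup: the rendering plane attached to the rig's left camera.
        _renderingPlane = GameObject.Find("ZED_Rig_Mono/Camera_Left")
                                    .GetComponent<ZEDRenderingPlane>();
    }

    void Update()
    {
        // This texture stores raw depth values (not normalized for display).
        Texture depthTexture = _renderingPlane.Depth;
        if (depthTexture == null)
            return; // not available until the camera has started streaming

        // Use it: assign it to a material, Blit it, etc.
    }
}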

Hey @BenjaminV

Thanks for your answer! But as I said above, the depth normalization messes a bit with the values I need.
I need to process the image so that it only displays the values between two given distances, which is hard to do with the depth texture for the reasons I mentioned in the first message.

What I would like is to access the data where the depth is stored, as I do with ZEDCamera->GetDepthValue, but instead of retrieving it pixel by pixel, retrieving all the information at once.

Would something like that be possible?

The depth texture I mentioned in my last message should not be normalized.

When you did your test, what depth texture were you using exactly?

Sorry @BenjaminV, I thought all depth textures were normalized.

I grabbed this texture but I don’t understand the results.

The red image is a quad with the Depth texture from the ZEDRenderingPlane, and the big one is a RawImage that renders the RenderTexture created in ZED_Rig_Mono->Camera_Left, with the View Mode set to VIEW_DEPTH.

Shouldn’t the depth texture you mentioned render all the elements that can be seen at the bottom of the grayscale image? Or are those elements hidden because their values are greater than 255?

If so, is there a formula to convert the RGB values to meters?

Thank you!

Update:
It looks like the rest of the elements that do not appear in the red image have values greater than 255.
I used the step node in my shader and was able to render them.


So the final question, @BenjaminV, would be: what is the formula to translate the RGB values to metric values (meters, in this case)?

Hi,

You can’t display the depth texture like that; it is a texture in the “RFloat” format. The value of each pixel is the depth in meters, so there is no conversion to do.
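
If you need all of those values on the CPU at once (your array-of-floats question), a readback along these lines should work. This is only a sketch, and reading back float formats with ReadPixels depends on platform support:

// Sketch: copy the RFloat depth texture into a CPU-readable Texture2D and
// extract the values as a float array (one float per pixel, in meters).
float[] ReadDepthValues(Texture depthTexture)
{
    RenderTexture rt = RenderTexture.GetTemporary(
        depthTexture.width, depthTexture.height, 0, RenderTextureFormat.RFloat);
    Graphics.Blit(depthTexture, rt);

    Texture2D readable = new Texture2D(rt.width, rt.height, TextureFormat.RFloat, false);
    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = rt;
    readable.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    readable.Apply();
    RenderTexture.active = previous;
    RenderTexture.ReleaseTemporary(rt);

    float[] depthValues = readable.GetRawTextureData<float>().ToArray();
    UnityEngine.Object.Destroy(readable);
    return depthValues;
}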

Hey @BenjaminV

Thanks, with all that information I could finally implement a proper solution.

I created a custom unlit shader, assigned it to a material, and with Graphics.Blit I get a RenderTexture with the processed information.

Graphics.Blit(_zedRenderingPlane.Depth, _generatedTexture, _arturoTestMaterial);

Shader "Arturo/DepthFilterUnlitShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Greater("Greater", Range(0, 4)) = 0.5
        _Less("Less", Range(0, 4)) = 1.5
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            // Outputs 1 when A >= B, 0 otherwise.
            void GreaterOrEqual_float(float A, float B, out float Out)
            {
                Out = A >= B ? 1 : 0;
            }

            // Outputs 1 when A <= B, 0 otherwise.
            void LessOrEqual_float(float A, float B, out float Out)
            {
                Out = A <= B ? 1 : 0;
            }

            // Logical AND of the two comparison results.
            void Unity_And_float(float A, float B, out float Out)
            {
                Out = A && B;
            }

            sampler2D _MainTex;
            float4 _MainTex_ST;

            float _Less;
            float _Greater;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                // The depth texture from ZEDRenderingPlane is flipped vertically, so flip the UVs back.
                float2 flipped_uv = float2(v.uv.x,1 - v.uv.y);
                o.uv = TRANSFORM_TEX(flipped_uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // The red channel of the RFloat depth texture holds the depth in meters.
                float _MainTexRedChannel = tex2D(_MainTex, i.uv).r;
                float _GreaterComparisonOutput;
                float _LesserComparisonOutput;
                float _AndOutput;

                // Keep only the pixels whose depth lies between _Greater and _Less.
                GreaterOrEqual_float(_MainTexRedChannel, _Greater, _GreaterComparisonOutput);
                LessOrEqual_float(_MainTexRedChannel, _Less, _LesserComparisonOutput);
                Unity_And_float(_GreaterComparisonOutput, _LesserComparisonOutput, _AndOutput);

                return _AndOutput.rrrr;
            }

            ENDCG
        }
    }
}

As the value of each pixel is the depth in meters, I can filter the data I want to show by comparing against the minimum and the maximum distance, and then display only the pixels that pass both comparisons.

I had to flip the UVs in the line float2 flipped_uv = float2(v.uv.x,1 - v.uv.y); because the depth texture in ZEDRenderingPlane.cs is flipped vertically.
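
For completeness, the C# side is roughly this. It is a sketch: _zedRenderingPlane is the ZEDRenderingPlane on the left camera of the rig, and the other field names are just mine.

using UnityEngine;

public class DepthFilter : MonoBehaviour
{
    [SerializeField] private ZEDRenderingPlane _zedRenderingPlane;
    [SerializeField] private Material _arturoTestMaterial;  // uses Arturo/DepthFilterUnlitShader
    [SerializeField] private float _minDistance = 0.5f;     // meters, fed to _Greater
    [SerializeField] private float _maxDistance = 1.5f;     // meters, fed to _Less

    private RenderTexture _generatedTexture;

    void Update()
    {
        Texture depth = _zedRenderingPlane.Depth;
        if (depth == null)
            return; // depth texture not ready yet

        if (_generatedTexture == null)
            _generatedTexture = new RenderTexture(depth.width, depth.height, 0);

        // Pass the distance window to the shader and run it over the depth texture.
        _arturoTestMaterial.SetFloat("_Greater", _minDistance);
        _arturoTestMaterial.SetFloat("_Less", _maxDistance);
        Graphics.Blit(depth, _generatedTexture, _arturoTestMaterial);
    }
}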

Result:

I hope this helps if someone finds the same problem.

Thanks a lot @BenjaminV for your support during this journey!
