ZED Unity plugin - stereo rendering - AR passthrough for Quest 3

Hello, Benjamin!
I hope you’re doing well. I wanted to share an update on our end regarding the ZED camera plugin:

Object Scaling & Stretching Fixed

We resolved the horizontal stretching issue by replacing a single method in ZEDRenderingPlane.cs.
The scaling discrepancy turned out to be primarily due to the physical setup of the ZED camera: its mounting angle, distance, and alignment significantly affect perceived scale in the headset.
Everything now renders with correct proportions.

New Issue: Low Streaming Frame Rate

We are now seeing only 15–30 FPS in the headset during streaming, regardless of connection method (USB Quest Link or Air Link).

We have already tried:

Switching depth modes (Performance, Quality, Neural); Neural gave us the lowest FPS of the three.
Toggling Enable Streaming Output in the Unity Inspector and adjusting the bitrate, target FPS, GOP size, etc.
Using both USB and Wi-Fi connections with our Quest 3.
None of these changes improve the frame rate. Latency remains acceptable (~0.5 s on default settings), but the low FPS makes the experience choppy.
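
Would a simple frame-time check on the Unity side, along the lines of the sketch below (standard Unity APIs only, nothing plugin-specific), be a sensible way to tell whether the render loop or the stream is the bottleneck?

```csharp
using UnityEngine;

// Simple diagnostic using only standard Unity APIs: logs the average render-loop
// frame rate once per second. If this stays near the headset refresh rate while
// the camera image still looks choppy, the bottleneck is the grab/stream rate
// rather than Unity rendering itself.
public class RenderFpsLogger : MonoBehaviour
{
    private int frames;
    private float elapsed;

    void Update()
    {
        frames++;
        elapsed += Time.unscaledDeltaTime;

        if (elapsed >= 1f)
        {
            Debug.Log($"Render loop: {frames / elapsed:F1} FPS");
            frames = 0;
            elapsed = 0f;
        }
    }
}
```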

Could you advise:

What factors in the plugin or streaming pipeline might be limiting our FPS?
Are there additional settings—either in the plugin or Unity—that we should adjust?
Any recommended best practices for achieving higher frame rates?

And finally, how can we manually adjust object scaling in code? Are there parameters within the plugin (in ZEDRenderingPlane.cs, for example) that allow control over the scale of rendered objects? Could we add a variable or two to control that?

Thank you again for your ongoing support and for all the work you’ve put into the plugin. We appreciate any insights or suggestions you can share.

Hi,

Thanks for the feedback.

How many FPS are you getting on your ZED Box? Can you try opening ZED Depth Viewer on the box?

This will tell us whether the issue comes from the box itself or from the data stream.

I think that you can modify the scale of the object by changing the plane distance (https://github.com/stereolabs/zed-unity/blob/master/ZEDCamera/Assets/SDK/Helpers/Scripts/Display/ZEDRenderingPlane.cs#L519) and its size.
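
If it helps, here is a rough, untested sketch of what I mean, assuming you grab a reference to the rendering plane's Transform (the component and field names below are placeholders, not plugin API):

```csharp
using UnityEngine;

// Illustrative sketch only, not part of the plugin: lets you experiment with the
// rendering plane's distance and size independently.
// "renderingPlane" is assumed to be the Transform of the quad that
// ZEDRenderingPlane draws the camera image onto, parented under the eye camera;
// assign it in the Inspector.
public class RenderPlaneScaleTweak : MonoBehaviour
{
    public Transform renderingPlane;

    [Range(0.5f, 2f)]
    public float distanceFactor = 1.0f; // >1 pushes the plane farther from the eye camera
    [Range(0.5f, 2f)]
    public float sizeFactor = 1.0f;     // >1 enlarges the image on the plane

    private Vector3 baseLocalPosition;
    private Vector3 baseLocalScale;

    void Start()
    {
        baseLocalPosition = renderingPlane.localPosition;
        baseLocalScale = renderingPlane.localScale;
    }

    void LateUpdate()
    {
        // Scaling the local position scales the plane's offset from its parent camera,
        // i.e. it changes the plane distance. With the size unchanged, the image then
        // covers a smaller angular area (content looks smaller but borders can appear);
        // scaling the size by the same factor restores the original coverage.
        renderingPlane.localPosition = baseLocalPosition * distanceFactor;
        renderingPlane.localScale = baseLocalScale * sizeFactor;
    }
}
```

Setting distanceFactor and sizeFactor to the same value should leave the view visually unchanged, which is a quick way to confirm you are driving the right Transform.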

Thanks.

Stereolabs Support

Hi Benjamin,

First of all, thank you for your prompt feedback on the low-FPS issue during our VR streams. After updating to the latest Meta Quest Link version, the discrepancy disappeared: the FPS reported on the ZED Box now matches what we see in our build.

I wanted to update you on our scaling experiments and ask for advice on the remaining “object size” behavior:

  1. Steps we’ve taken to adjust object scale:

Field of View (FOV) tweaks
– Scaled cam.fieldOfView by 10% in ZEDReady().

Projection matrix injections
– Applied a zoomFactor to newmat[0,0]/newmat[1,1] in SetProjection() (see the self-contained sketch after this list).

Plane distance & scale
– Tried changing the rendering plane’s localPosition.z (distance) and its transform.localScale.

World & camera rig adjustments
– Experimented with scaling a WorldRoot container and offsetting the camera parent.

Canvas scaling
– Ultimately, we achieved the most noticeable effect by scaling the ZED rendering canvas itself by 0.9–1.1×.

  2. Observed result:
    While canvas-scaling does make everything approximately 10 % “smaller,” it also introduces a visible “window-in-window” effect—black or empty borders appear around the edges, framing the view.

  3. Remaining concern:
    It still seems that close objects are unnaturally large while distant objects shrink dramatically. We suspect this is simply a consequence of the camera’s standard perspective (division by z), but we’re hoping there’s a way to achieve a more uniform perceived scale across different depths without the framing artifact.
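
For reference, here is a self-contained version of the projection tweak from step 1 (a sketch on a plain, non-XR camera rather than our exact plugin change; zoomFactor is the same placeholder name we used in SetProjection()). Since a perspective projection maps a view-space point to roughly f·x/z on screen, multiplying f by 0.9 scales every screen position by 0.9 regardless of depth, which is also why the uncovered 10 % shows up as a border:

```csharp
using UnityEngine;

// Self-contained sketch of the m[0,0]/m[1,1] zoom, shown on a plain (non-XR) camera
// for clarity; in our build the equivalent change lives inside SetProjection().
[RequireComponent(typeof(Camera))]
public class ProjectionZoom : MonoBehaviour
{
    [Range(0.8f, 1.2f)]
    public float zoomFactor = 0.9f; // <1 shrinks everything on screen by the same ratio

    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
    }

    void LateUpdate()
    {
        // Recover the unmodified projection first so the zoom is not compounded
        // frame after frame, then scale the focal-length terms.
        cam.ResetProjectionMatrix();
        Matrix4x4 m = cam.projectionMatrix;
        m[0, 0] *= zoomFactor; // horizontal focal length
        m[1, 1] *= zoomFactor; // vertical focal length
        cam.projectionMatrix = m;
    }

    void OnDisable()
    {
        if (cam != null) cam.ResetProjectionMatrix();
    }
}
```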

Could you advise whether this depth-dependent scaling is expected in the ZED VR pipeline, or suggest a preferred approach for getting a consistent 10 % shrinkage of all depths without visible borders?

Thank you again for your help—looking forward to any recommendations you can share.