How to stream ZED X video to Meta Quest 3 with Unity

Hello,

I’m currently working on a project using a ZED X camera with a ZED Box Orin NX for streaming. Since I’m quite new to this field, I’ve been struggling for a few weeks to set up the system properly.

My goal is to stream the video captured by the ZED X to a Windows PC, and from there display the stereo video on a Meta Quest 3 headset (eventually handling SBS 3D video).

I’ve tried several approaches, including Unity development, various open-source plugins, and the ZED plugin for Unity, but so far I haven’t been able to figure out a clear path forward.

Could anyone share some advice or point me in the right direction on how to properly handle this workflow? Any guidance or references would be greatly appreciated.

Thank you very much in advance.

Hi,

The ZED Unity plugin (https://github.com/stereolabs/zed-unity) should allow you to do so.
What issues did you encounter when you tried this approach?

Best,

Thanks for the quick reply!

I have a couple of questions:

  1. Regarding installation of the ZED SDK Unity Plugin – which method is correct?

    • Adding the package from a Git URL via the Unity Package Manager (as described in the instructions), or

    • Adding the whole Git repository as a project in Unity Hub (e.g., zed-unity-5.0.1 > ZEDCamera).
      Also, is there a recommended Unity version, or does it not matter?

  2. My goal is to stream video from a ZED Box Orin NX to Quest 3 via RTSP (or WebRTC).

    • In the Movie Screen sample, on the ZED_Rig_Stereo prefab’s ZED Manager component,
      I set Input Type to Stream with the RTSP server’s IP and port (8554), but it didn’t work.

    • For reference, the following GStreamer pipelines work between the Jetson server and a Windows client:

      • Server: gst-zed-rtsp-launch -a 192.168.2.10 zedsrc ~

      • Client: gst-launch-1.0.exe rtspsrc location=rtsp://192.168.2.10:8554/zed-stream ~
        These pipelines are long because they use NVIDIA hardware encode/decode for low latency. Can I use them as-is, or should I simplify them first for initial testing? (A minimal receiver test I have in mind is sketched below.)
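
For a first sanity check on the Windows side, the minimal receiver I have in mind just opens the RTSP URL with OpenCV and displays whatever arrives. This is only a rough sketch and assumes the OpenCV build on the PC can open RTSP through its FFmpeg backend (no NVIDIA hardware decode here):

    import cv2

    # Rough sanity check on the Windows client: open the RTSP stream published by
    # gst-zed-rtsp-launch on the Jetson and display the decoded frames as-is.
    URL = "rtsp://192.168.2.10:8554/zed-stream"

    cap = cv2.VideoCapture(URL)
    if not cap.isOpened():
        raise RuntimeError(f"Could not open {URL}")

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("ZED RTSP test", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()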

Any advice would be very helpful. Thanks again!

Hi,

In order to open a stream directly from the ZED Unity plugin, the video needs to be sent via the streaming module of the SDK (see: Local Video Streaming - Stereolabs), for example using our streaming sample (camera streaming/sender in the stereolabs/zed-sdk repository on GitHub).

Then, you should be able to open the camera “as stream” in the plugin by specifying the Jetson’s IP and the port (30000 by default).
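
For reference, the sender side boils down to something like this with the Python API (a simplified sketch based on that sample; the resolution, codec and port are only examples you can change):

    import pyzed.sl as sl

    # Simplified sketch of the SDK streaming sender running on the Jetson
    # (adapted from the "camera streaming/sender" sample).
    zed = sl.Camera()

    init_params = sl.InitParameters()
    init_params.depth_mode = sl.DEPTH_MODE.NONE  # video-only stream, no depth needed here

    if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("Failed to open the ZED X")

    stream_params = sl.StreamingParameters()
    stream_params.codec = sl.STREAMING_CODEC.H264  # H265 is also available
    stream_params.port = 30000                     # default port, must match the receiver

    if zed.enable_streaming(stream_params) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("Failed to enable streaming")

    runtime = sl.RuntimeParameters()
    try:
        while True:
            zed.grab(runtime)  # each grab pushes encoded frames to connected receivers
    finally:
        zed.disable_streaming()
        zed.close()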

Okay, thank you for the reply.

Just to make sure I fully understand: with the ZED SDK streaming module, it’s possible to send live video from the Jetson and then receive and open it directly in Unity on the Quest 3, correct?

I also have a few follow-up questions:

  • Is it possible to stream outside a local network (e.g., to a remote site) with minimal latency?

  • Can I configure parameters such as frame rate, resolution, or mono/stereo modes (e.g., side-by-side or top-bottom)?

  • Does the SDK rely on RTSP under the hood, or is it a different protocol? Could the stream be received by a non-ZED client?

  • Finally, does the SDK provide a way to view the stream in stereo inside the Quest (e.g., the left camera image mapped to the left lens), or is that covered by the examples in the ZED Unity plugin?

Thanks again for your support!

Hi,

Yes, that’s right.

  • The streaming module only allows streaming on the local network. To stream to a remote site, you’d need to use a VPN, but we can’t guarantee low latency in that case.

  • No, it’s not possible to receive the stream outside of the ZED SDK. An alternative would be to use GStreamer instead of the SDK (Getting Started with GStreamer and ZED - Stereolabs); see the short sketch after this list.

  • The ZED Unity plugin natively supports stereo display on the headset, so as long as you are using the plugin (and the SDK), it should work without any additional work.
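
If you go the GStreamer route instead, the receiver gets a single composite frame, so splitting it into per-eye images is just a crop. Purely as an illustration (assuming a side-by-side layout):

    import numpy as np

    # Illustration only: a decoded side-by-side frame is one image with the left
    # view in the left half and the right view in the right half.
    def split_sbs(frame: np.ndarray):
        height, width = frame.shape[:2]
        left = frame[:, : width // 2]    # map to the left eye in the headset
        right = frame[:, width // 2 :]   # map to the right eye in the headset
        return left, right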

Hi,

Thank you for the detailed answer.

Just to confirm: with the ZED Unity plugin, it’s only possible to receive streams from the ZED SDK over a local network, correct? If so, the plugin alone won’t work for my remote streaming case.

Still, I’d like to use the plugin’s stereo display capability on the Quest 3. Is there any workaround for that, for example grabbing the stream another way and having the ZED Unity plugin recognize it as a ZED camera?

Also, from what I understand, for remote streaming I’d need to rely on GStreamer (or is it possible to write my own RTSP streaming code with the ZED API?). Then, in Unity, I’d either have to handle the stream myself or integrate it with the ZED Unity plugin.

Thanks again for your quick and helpful replies…!

Hi,
No, you cannot use non-ZED cameras (or a custom stream) as input to the ZED SDK.

To sum up, you have to use the ZED SDK on both ends (sender and receiver), or not at all, for instance by using the GStreamer plugin.

Stereolabs Support

Thanks again @BenjaminV !

Ok, I think I fully understand now.
Just to confirm one last time: if I use the ZED SDK (e.g., with OpenCV, GStreamer, or the Unity plugin) on one side, it won’t be compatible with non-ZED libraries on the other side, correct?

For example, since the ZED SDK doesn’t support remote streaming directly, as I understand it, I would have to rely on GStreamer, OpenCV, or WebRTC without the ZED SDK; in that case, the client side couldn’t use the ZED SDK either.
(Although the ZED SDK’s OpenCV and GStreamer integrations look quite similar to the standard libraries, I’d like to confirm that they are truly not compatible.)

Also, is it possible to use non-ZED libraries directly with a ZED X camera?
For instance, in OpenCV with cv2.VideoCapture({ZED X camera}, ~) or in GStreamer with gst-launch-1.0 {zedxsrc}?

And just to be clear, the ZED SDK itself does not support remote streaming, correct?

Thanks again for your support! 🙂

You can use our GStreamer integration (https://www.stereolabs.com/docs/gstreamer), so you don’t need to use the SDK at all and it is compatible with the ZED X cameras.
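
For example, on the Jetson you could read frames from the zedsrc element through OpenCV’s GStreamer backend, without calling the SDK API yourself. This is only a sketch and assumes OpenCV is built with GStreamer support and the zed-gstreamer plugin is installed (check gst-inspect-1.0 zedsrc for the available stream-type values if you want a stereo composite):

    import cv2

    # Sketch: grab frames from the ZED X through the zed-gstreamer plugin using
    # OpenCV's GStreamer backend. zedsrc outputs the left image by default;
    # stereo composite modes are selected via its stream-type property.
    pipeline = (
        "zedsrc ! "
        "videoconvert ! video/x-raw, format=BGR ! "
        "appsink drop=true max-buffers=1"
    )

    cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    if not cap.isOpened():
        raise RuntimeError("Could not open the zedsrc pipeline")

    ok, frame = cap.read()
    if ok:
        print("Got a frame:", frame.shape)

    cap.release()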

And yes, the SDK itself does not support remote streaming.

Stereolabs Support
