Stream Stereo Images Only to Standalone VR device built with Unity 3D

Hi there!

I would like to build an AR application for a standalone VR device using Unity 3D, for which I only need the stereo images from the ZED Mini camera. In my setup, the ZED Mini is connected to a CUDA-capable GPU machine, and I would like to retrieve the stereo images over local video streaming (preferably via HTTP requests).
This should be a straightforward setup; however, neither the current ZED SDK nor the ZED Unity plugin makes it easy to get these stereo images and render them in Unity without depending on the .dll API calls in the ZED Unity plugin, which is obviously not possible if we want to deploy on the Android platform. I have been looking for a solution everywhere (emailed ZED Support and posted a GitHub issue on the ZED Unity plugin), but I have not found any straightforward, high-performance solution. I would really appreciate it if anyone here could shed some light on how to approach this problem with the ZED Mini camera! Thank you very much!!
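To make the goal concrete, this is the kind of server I have in mind on the GPU machine: grab side-by-side stereo frames, JPEG-encode them, and push them out as an MJPEG (multipart/x-mixed-replace) HTTP response. The sketch below is Python; the camera grab and JPEG encoding are only indicated in comments (pyzed / OpenCV would be assumptions), but the multipart framing is the part the headset client would need to agree on.

```python
# Sketch of an MJPEG-over-HTTP server for side-by-side stereo frames.
# The actual capture (ZED SDK / pyzed) and JPEG encoding are assumptions
# and only indicated in comments below.
from http.server import BaseHTTPRequestHandler, HTTPServer

BOUNDARY = "zedframe"

def mjpeg_part(jpeg_bytes: bytes, boundary: str = BOUNDARY) -> bytes:
    """Wrap one JPEG-encoded frame as a multipart/x-mixed-replace part."""
    header = (
        f"--{boundary}\r\n"
        "Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg_bytes)}\r\n\r\n"
    ).encode("ascii")
    return header + jpeg_bytes + b"\r\n"

class StereoStreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header(
            "Content-Type", f"multipart/x-mixed-replace; boundary={BOUNDARY}"
        )
        self.end_headers()
        while True:
            # jpeg = encode_side_by_side_frame()  # hypothetical helper:
            # grab a side-by-side ZED frame and JPEG-encode it, then:
            # self.wfile.write(mjpeg_part(jpeg))
            break  # placeholder so this sketch terminates

# To run: HTTPServer(("0.0.0.0", 8080), StereoStreamHandler).serve_forever()
```

The client then reads one long HTTP response and splits it on the boundary to recover frames; no SDK calls are needed on the headset side.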


As you said, the Unity plugin cannot be installed on the headset (nor can the ZED SDK).
I think the best solution is to develop your own Unity app that takes the ZED images as input (via a stream) and displays them in the headset.
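In such an app, the headset receives one side-by-side frame per update and must split it into per-eye images before uploading them as textures (in Unity you would do this in C# with Texture2D or a shader; the Python sketch below only illustrates the buffer layout, assuming a row-major raw buffer where each row holds the left eye followed by the right eye, as in the ZED's side-by-side view).

```python
# Sketch: split a side-by-side stereo frame into left/right eye images.
# Assumes a row-major raw buffer where each row is the left-eye row
# followed by the right-eye row (ZED side-by-side layout).

def split_side_by_side(frame: bytes, width: int, height: int, bpp: int = 3):
    """Return (left, right) buffers from one side-by-side frame.

    `width` is the width of a single eye image, so each input row is
    `2 * width * bpp` bytes long.
    """
    row_len = 2 * width * bpp
    half = width * bpp
    left = bytearray()
    right = bytearray()
    for y in range(height):
        row = frame[y * row_len:(y + 1) * row_len]
        left += row[:half]
        right += row[half:]
    return bytes(left), bytes(right)
```

In Unity, the two resulting buffers would each be loaded into a texture assigned to the corresponding eye's rendering plane.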

You can take a look at the ZEDRenderingPlane.cs and ZEDMixedRealityPlugin.cs scripts in the Unity plugin to see how we did it.

Benjamin Vallon


I took a look at ZEDRenderingPlane.cs and ZEDMixedRealityPlugin.cs; these scripts only show how to update the shaders and textures, and they don’t tell us anything about how to retrieve the video stream in Unity… I also looked at ZEDCamera.cs and found that it only updates via “.dll” calls into the ZED SDK…
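For what it’s worth, if the images arrive as an MJPEG HTTP stream, the receiving side does not need any SDK .dll at all; it only has to carve complete JPEG frames out of the incoming bytes. Here is a minimal Python sketch of that parsing (in Unity this logic would be C# on top of UnityWebRequest or a raw socket; the naive start/end marker scan is an assumption that works for typical MJPEG payloads):

```python
# Sketch: extract JPEG frames from an MJPEG byte stream without any SDK,
# by scanning for the JPEG start (FF D8) and end (FF D9) markers.
# Naive marker scanning is assumed to be sufficient for MJPEG payloads.

def extract_jpegs(buffer: bytes):
    """Return (frames, remainder): complete JPEGs found, plus leftover bytes."""
    frames = []
    while True:
        start = buffer.find(b"\xff\xd8")
        if start < 0:
            return frames, b""
        end = buffer.find(b"\xff\xd9", start + 2)
        if end < 0:
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
```

Each extracted frame can then be decoded into a texture (e.g. Texture2D.LoadImage in Unity) and split into the two eye images.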

I also took a look at this Unity–GStreamer bridge repo, gst-unity-bridge/Unity at master · ua-i2cat/gst-unity-bridge · GitHub, but it only supports Android (armv7), so it cannot be used for builds targeting a standalone device such as the Quest 2.

Does anyone have a recommendation on how to proceed from here?