Stream Video to non-CUDA device

I’m using the ZED2 connected to a Jetson Nano on a UAV. All processing is done on the Jetson Nano, since the system must work without a ground station.
For debugging, I want to view the image on my laptop. Unfortunately, my laptop does not have an NVIDIA GPU, so I can’t use the Local Streaming feature.
This post also explains that the local stream uses a proprietary format, so opening it in VLC or another application won’t work.
Of course I could retrieve the images manually, encode them, and send them somehow, but since the ZED SDK already encodes the images for recording, can’t it also stream them in a format that a web browser or VLC understands? I already have a solution for the sensor data, so I only need the video.
I already searched Google and this forum for similar questions but didn’t find anything. That surprises me a bit, because I don’t think my use case is that exotic.

EDIT: The Jetson and the laptop are connected via WiFi.


You might be interested in the following repository:

In particular, it shows how to create an RTSP server with GStreamer to stream the left video (or the right, or another stream), which you can then open with VLC, for example.
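As a rough sketch of the workflow (untested here, and the element names, property values, mount point, and IP address are assumptions — check the zed-gstreamer README for the exact syntax your version uses):

```shell
# On the Jetson: serve the left image as H.264 over RTSP using the
# gst-zed-rtsp-launch tool from the zed-gstreamer repository.
# stream-type=0 is assumed to select the left image; verify against
# the zedsrc property documentation in the repo.
gst-zed-rtsp-launch zedsrc stream-type=0 ! videoconvert \
    ! x264enc tune=zerolatency ! rtph264pay name=pay0

# On the laptop: open the stream in VLC. Replace the IP and mount
# point with the URL the server prints on startup.
vlc rtsp://192.168.1.10:8554/zed-stream
```

On a Jetson you would likely swap `x264enc` for the platform’s hardware H.264 encoder element to reduce CPU load; the repository’s examples show which encoder it uses.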

/* OB */

Hi, thanks for your quick reply. Can I use the GStreamer source plugin and the C++ API at the same time?

Hi there! I have the same question. I would like to stream only the video from the left and right cameras and request it via HTTP, but I couldn’t find the camera URL of the ZED Mini. The stereo images will be received and processed by a standalone VR device built with Unity 3D. This should be possible, since I only want the AR environment without any AR post-processing. But the current ZED SDK and ZED Unity plugin make it very difficult to retrieve those stereo images and use them somewhere else, like Unity.

I asked two months ago in a ZED Unity Plugin GitHub issue but didn’t get a response. I hope someone here can shed some light on this.

Yes, this is what is done here:

GStreamer and the ZED SDK are used at the same time.


@keli95566 What you need to do is create your side-by-side image (left + right) and then send it over the network via RTSP or something else (for latency, RTSP is better than HTTP).
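The composition step described above is just stacking the two images horizontally into one frame. Here is a minimal, generic sketch using NumPy arrays (the `make_side_by_side` helper and the synthetic images are illustrations, not part of the ZED SDK; in practice the inputs would come from the SDK’s image retrieval calls):

```python
import numpy as np

def make_side_by_side(left, right):
    """Stack two equally sized images horizontally into one frame.

    left, right: HxWxC uint8 arrays (e.g. left/right camera images
    converted to NumPy). Returns an Hx(2W)xC array.
    """
    if left.shape != right.shape:
        raise ValueError("left and right images must have the same shape")
    # Concatenate along the width axis: left half, then right half.
    return np.concatenate([left, right], axis=1)

# Tiny synthetic demo: two 2x3 single-channel "images".
left = np.zeros((2, 3, 1), dtype=np.uint8)
right = np.ones((2, 3, 1), dtype=np.uint8)
sbs = make_side_by_side(left, right)
print(sbs.shape)  # (2, 6, 1)
```

The resulting side-by-side frame can then be handed to whatever encoder/streamer you use (e.g. a GStreamer pipeline) as a single video stream.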

Again, the zed-gstreamer repository can generate this stream. Look at the repo here:

@obraun-sl Thanks again for your reply. This code looks like exactly what I want. I will try to integrate it into my project when I have time.


Thank you! I will test it in the coming days and see if it works.

Hi there, an update. I managed to stream ZED data to a standalone VR device running on ARMv7 and OpenGL ES 3. However, I see a huge jittering artifact that occurs randomly, as the GIF shows.


I don’t have a computer graphics background and can’t understand why this occurs. The artifact only appears when running on the Android VR device, not in the Unity Windows Editor. Is the video decoding somehow wrong, or is it the VR headset hardware? I’d really appreciate it if anyone could kindly help!