I’m using a ZED 2 connected to a Jetson Nano on a UAV. All processing is done on the Jetson Nano because the system must work without a ground station.
For debugging, I want to view the image on my laptop. Unfortunately, my laptop does not have an NVIDIA GPU, so I can’t use the Local Streaming feature. This post also explains that the local stream uses a proprietary format, so opening it in VLC or another application won’t work.
Of course I can retrieve the images manually, encode them, and send them somehow, but if the ZED SDK already encodes the images for recording anyway, can’t it also stream them in a format that a web browser or VLC understands? I already have a solution for the sensor data, so I only need the video.
I already searched Google and this forum for similar questions, but I didn’t find anything. That surprises me a bit, because I don’t think my use case is that exotic.
EDIT: The Jetson and the laptop are connected via Wi-Fi.
You might be interested in the zed-gstreamer repository:
In particular, it shows how to create an RTSP server with GStreamer that streams the left video (or the right, or another view), which you can then open with VLC, for example.
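On the receiving laptop, no NVIDIA GPU is needed because you can decode in software. A minimal sketch of the client side, assuming the RTSP server on the Jetson listens on port 8554 with mount point `zed-stream` (both the port and mount point are assumptions; check the zed-gstreamer documentation for the actual defaults):

```python
# Build the URL/pipeline a viewer would use. Host, port, and mount point
# are assumptions -- adjust them to match your RTSP server configuration.
def rtsp_url(host: str, port: int = 8554, mount: str = "zed-stream") -> str:
    """Return the RTSP URL that VLC can open via Media > Open Network Stream."""
    return f"rtsp://{host}:{port}/{mount}"

def receive_pipeline(host: str, port: int = 8554, mount: str = "zed-stream") -> str:
    """Return a GStreamer receive pipeline using software decoding only."""
    return (
        f"rtspsrc location={rtsp_url(host, port, mount)} latency=100 "
        "! rtph264depay ! h264parse ! avdec_h264 "  # CPU H.264 decoder, no GPU needed
        "! videoconvert ! autovideosink"
    )

print(rtsp_url("192.168.1.42"))
print("gst-launch-1.0 " + receive_pipeline("192.168.1.42"))
```

Pasting the printed URL into VLC, or running the printed `gst-launch-1.0` command, should display the stream on any machine with GStreamer’s base/good/libav plugins installed.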
Hi there! I have the same question. I would like to stream only the video from the left and right cameras and request it via HTTP, but I couldn’t find the camera URL of the ZED Mini. The stereo images will be received and processed by a standalone VR device built with Unity 3D. This should be possible, as I only want the AR environment without any AR post-processing. But the current ZED SDK and ZED Unity plugin make it very difficult to retrieve those stereo images and use them somewhere else, like Unity.
I opened an issue on the ZED Unity plugin GitHub two months ago but didn’t get a response. I hope someone here can shed some light on this.
@keli95566 What you need to do is create your side-by-side image (left + right) and then send it over the network via RTSP or something else (for latency, RTSP is better than HTTP).
Again, the zed-gstreamer repository can generate this stream. Look at the repo here:
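To illustrate the side-by-side step itself: once you have the left and right frames (e.g. from `retrieveImage()`, as raw image buffers), composing them is just a horizontal concatenation. A sketch with NumPy placeholder frames standing in for the real camera output (the resolution and BGRA layout here are assumptions for illustration):

```python
import numpy as np

def side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Concatenate left and right frames horizontally into one wide image."""
    if left.shape != right.shape:
        raise ValueError("left and right frames must have the same shape")
    return np.hstack((left, right))

# Placeholder frames standing in for retrieveImage() output (720p, BGRA).
left = np.zeros((720, 1280, 4), dtype=np.uint8)
right = np.full((720, 1280, 4), 255, dtype=np.uint8)

sbs = side_by_side(left, right)
print(sbs.shape)  # (720, 2560, 4): one frame, double width
```

The resulting double-width frame is what you would then feed to the encoder and RTSP sender as a single video stream; the VR client splits it back into two eyes.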
Hi there, an update. I managed to stream ZED data to a standalone VR device running on ARMv7 and OpenGL ES 3; however, I see a severe jittering artifact that occurs randomly, as the GIF shows.
I don’t have a computer graphics background and can’t understand why this occurs. The artifact only appears when running on the VR Android device, not in the Unity editor on Windows. Is the video decoding somehow wrong, or is it the VR headset hardware? I would really appreciate it if anyone could help!
I am able to get GStreamer to work from the command line using the various examples, but while it is running I cannot open the ZED camera in my own code to control the depth mode and do spatial mapping with the ZED SDK.
Does anyone have instructions or a code sample showing how to open the ZED camera with the ZED SDK and then produce a GStreamer RTP stream of a single camera feed (left or right)? Put another way: since I am already calling grab() and retrieveImage() in my code, can I push those frames into a GStreamer pipeline somehow?
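For reference, pushing frames you already retrieve into a pipeline is typically done with GStreamer’s `appsrc` element, all within the single process that owns the camera. A rough sketch of the pipeline description (the element choices, destination IP, and port are assumptions, not tested against the SDK):

```python
def appsrc_pipeline(width: int, height: int, fps: int = 30) -> str:
    """Pipeline that accepts raw BGRA frames and sends them as RTP/H.264."""
    caps = f"video/x-raw,format=BGRA,width={width},height={height},framerate={fps}/1"
    return (
        f"appsrc name=zedsrc caps={caps} format=time is-live=true "
        "! videoconvert ! x264enc tune=zerolatency "
        "! rtph264pay ! udpsink host=192.168.1.42 port=5000"  # laptop IP: assumption
    )

print(appsrc_pipeline(1280, 720))

# With the GStreamer Python bindings (gi/Gst), the push side looks roughly
# like this inside your grab loop (sketch only, not verified):
#   pipeline = Gst.parse_launch(appsrc_pipeline(1280, 720))
#   appsrc = pipeline.get_by_name("zedsrc")
#   pipeline.set_state(Gst.State.PLAYING)
#   ...
#   buf = Gst.Buffer.new_wrapped(frame_bytes)   # bytes from retrieveImage()
#   appsrc.emit("push-buffer", buf)
```

Because the SDK calls and the pipeline live in one process, this sidesteps the problem of two programs trying to open the camera at once.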
Hi @grahambriggs
It is not possible to call the open function for the same camera from two different processes.
The only solution is for one of the processes to start a ZED stream and for the other to connect to that stream.
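As a sketch of that two-process setup, using names from the ZED SDK Python bindings (guarded so it only attempts the camera calls when `pyzed` is installed; treat the details as assumptions and check the SDK documentation):

```python
def stream_input(ip: str, port: int = 30000) -> tuple:
    """Address the second process uses to connect to the ZED stream."""
    return ip, port

try:
    import pyzed.sl as sl

    # Process A: owns the camera and publishes a ZED stream on the network.
    cam = sl.Camera()
    if cam.open(sl.InitParameters()) == sl.ERROR_CODE.SUCCESS:
        cam.enable_streaming(sl.StreamingParameters())  # default port: assumption

    # Process B (same or another machine): connect to that stream instead
    # of opening the camera directly, then grab()/retrieveImage() as usual.
    ip, port = stream_input("192.168.1.10")  # Jetson IP: assumption
    init_b = sl.InitParameters()
    init_b.set_from_stream(ip, port)
    # sl.Camera().open(init_b) ...
except ImportError:
    # pyzed not installed here; just show the connection address.
    print(stream_input("192.168.1.10"))
```

Process B gets full SDK functionality (depth, spatial mapping) from the received stream, so one process can serve video while the other does the processing.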
As you know, streaming should have no jitter and low latency.
Since I am new to streaming (I am a robotics engineer),
and it seems like you are using a standalone VR headset: would it be better for me to decode the video on a PC or laptop with a GPU and then mirror it to the VR headset for low latency?