Streaming with a ZED 2


I wanted to measure the streaming delay of my ZED 2 from a Jetson Nano to a Windows 10 desktop (GPU: RTX 2080 Super, CPU: i7-9700K, RAM: 16GB).
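Before blaming the ZED pipeline, it helps to validate the delay-measurement method itself: embed the sender's epoch timestamp in each packet and subtract it from the receiver's clock on arrival. Across two machines this is only meaningful if the clocks are synchronized (e.g. via PTP/ptpd); over loopback it at least sanity-checks the tooling. A minimal sketch (the port number is hypothetical, and this is not ZED SDK API):

```python
import socket
import struct
import time

# One-way delay probe: sender embeds its epoch timestamp in each UDP
# packet; the receiver subtracts it from its own clock on arrival.
# Across machines this requires synchronized clocks (e.g. ptpd);
# over loopback it validates the measurement method itself.

ADDR = ("127.0.0.1", 50007)  # hypothetical port, adjust as needed

recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(ADDR)
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

delays = []
for _ in range(100):
    send_sock.sendto(struct.pack("!d", time.time()), ADDR)
    data, _ = recv_sock.recvfrom(64)
    (sent_at,) = struct.unpack("!d", data)
    delays.append(time.time() - sent_at)

avg_ms = 1000 * sum(delays) / len(delays)
print(f"average one-way delay: {avg_ms:.3f} ms")
recv_sock.close()
send_sock.close()
```

With synced clocks, running the sender half on the Jetson and the receiver half on the desktop gives the network's contribution to the delay, separate from encode/decode time.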

I made sure to follow the installation procedure for the ZED SDK and the required components on both the Jetson Nano and the Windows 10 desktop.

The two devices were connected through a Gigabit switch, and the desktop was equipped with a Gigabit Ethernet adapter.

However, after building and running the camera streaming sample (from the StereoLabs GitHub), I realized that the delay was quite high in this setup (>300 ms). Moreover, I could only obtain results when streaming at 15 fps; any higher frame rate caused the program to quit/crash after receiving a few images (which arrived with a delay >10 s).

At this point, I thought that maybe the Jetson Nano was not powerful enough to perform this operation with the settings given in the sample code, so I tried different ones. I changed the encoder, bitrate, video quality, etc., but the delay remained above 300 ms.

I then swapped the Jetson Nano for another desktop with the same specs as the one I used earlier (two Windows 10 desktops with an RTX 2080 Super each, etc.). That only gained me about 100 ms when streaming at 15 fps, and I still encountered the same issues at higher frame rates.

Thank you for your help.

I once noticed in ZED Explorer that the frame rate switched back from 60 fps to 15 fps when I had issues with my USB extension; somehow the connection had fallen back to USB 2.
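That fallback to exactly 15 fps is plausible on raw bandwidth alone. A back-of-the-envelope check, assuming the camera sends uncompressed side-by-side stereo at 2 bytes per pixel (YUV 4:2:2 is an assumption here; the ZED 2's actual on-wire format may differ, so treat these as order-of-magnitude numbers):

```python
# Rough uncompressed bandwidth of a side-by-side stereo stream,
# assuming 2 bytes/pixel (YUV 4:2:2). The ZED 2's actual on-wire
# format may differ; these are order-of-magnitude estimates.
def stream_mbit_s(width, height, fps, bytes_per_pixel=2, stereo=True):
    pixels = width * height * (2 if stereo else 1)
    return pixels * bytes_per_pixel * fps * 8 / 1e6

USB2_MBIT_S = 480   # USB 2.0 High Speed, theoretical maximum
USB3_MBIT_S = 5000  # USB 3.0 SuperSpeed, theoretical maximum

hd720_60 = stream_mbit_s(1280, 720, 60)  # ~1769 Mbit/s, needs USB 3
hd720_15 = stream_mbit_s(1280, 720, 15)  # ~442 Mbit/s, fits USB 2

print(f"HD720@60: {hd720_60:.0f} Mbit/s, HD720@15: {hd720_15:.0f} Mbit/s")
```

Under these assumptions, HD720 at 60 fps far exceeds USB 2's 480 Mbit/s while 15 fps just squeezes under it, which would explain the automatic drop when the link degrades.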
And when receiving streams from the Jetson Nano using the Python examples, I always need an idle ZED Explorer open in the background; otherwise I observe severe frame drops. The difference with and without an idle ZED Explorer is also visible in the GPU load in the Windows 10 Task Manager. I have no idea what it does in the background.
Other than that, I can say that I don't have any frame-rate issues when streaming from the Jetson Nano with the provided streaming_sender example at H.264@60fps (however, my receivers crash when using H.265, see my other post).
I never had a look at the delay, though. I just record, and then work offline with the epoch image timestamps, which I synced across devices via ptpd. With these timestamps I don't see any delay compared to the other inputs I record (some CAN bus data). I only account for an image 'age' of 2-3 frame times, as stated in the SDK documentation (i.e. ~33 ms @ 60 fps).

Hey Alex

Did you ever resolve this? I have created a thread on what might be the same issue here: