Sending the stream

Hey there!
Thanks a lot for your work.

Could you please tell me how to send the stream of a ZED 2i to another system? What I want to do is this:
I run a custom object detector on a Xavier, and I want to send the visualized output, with bounding boxes, to another laptop.

Could you help me, please?

Hello, and thank you for reaching out to us.

We provide a simple streaming sample that allows you to do that:
The streamer can be used “as is”; it’s quite simple.

The receiver is no different from any ZED program: you open the camera just as if it were plugged into your machine, and everything else works the same. If your laptop is not powerful and you just want to view the video stream, you can run ZED Explorer and open the stream with the sender’s IP address.
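To illustrate the "open the stream like a local camera" point, here is a minimal receiver sketch using the Python API (pyzed). The IP address, port, and the small `parse_stream_address` helper are illustrative placeholders, not part of the SDK:

```python
# Minimal ZED network-stream receiver sketch (assumes the pyzed Python API
# is installed). IP/port below are placeholders; use your sender's values.

def parse_stream_address(address):
    """Split an "ip:port" string into (ip, port) for set_from_stream."""
    ip, _, port = address.rpartition(":")
    return ip, int(port)

if __name__ == "__main__":
    # Deferred import so the helper above works even without the SDK installed.
    import pyzed.sl as sl

    ip, port = parse_stream_address("192.168.1.10:30000")  # placeholder address

    init = sl.InitParameters()
    init.set_from_stream(ip, port)  # open the network stream instead of a local camera

    zed = sl.Camera()
    if zed.open(init) != sl.ERROR_CODE.SUCCESS:
        raise SystemExit("could not open the stream")

    image = sl.Mat()
    runtime = sl.RuntimeParameters()
    while zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        zed.retrieve_image(image, sl.VIEW.LEFT)  # same calls as with a local camera
```

Apart from `set_from_stream`, the grab/retrieve loop is identical to a locally connected camera, which is the point made above.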


Antoine Lassagne
Senior Developer - ZED SDK
Stereolabs Support

Bear with me, please!
My question is: I am running the custom detector on my machine, and at this point I want to send the view with the detections, not just the raw stream without them.
I am running the custom detector and I can receive the stream, but without the detections!

Is there any way to send a string along with the stream?

Thank you

Hello again,

Unfortunately, we don’t have something fully packaged that can do what you want (yet :wink:).
For now, you’ll have to write the video streaming code yourself. Using OpenCV is probably an easy way to achieve this. There are tutorials for this on the internet, for example this one :
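Since the question was about sending a string (the detections) along with the video, here is one way to sketch it with only the Python standard library: a small length-prefixed protocol where each message carries the encoded frame bytes plus a JSON string of bounding boxes. All names here are illustrative, not part of the ZED SDK:

```python
# Sketch of a tiny length-prefixed TCP protocol: each message carries the
# annotated frame (e.g. JPEG bytes) plus a JSON string of detections.
import json
import socket
import struct

# Header: (frame_length, metadata_length) as two unsigned 32-bit ints,
# network byte order.
HEADER = struct.Struct("!II")

def send_frame(sock, frame_bytes, detections):
    """Send one frame and its detection list over a connected socket."""
    meta = json.dumps(detections).encode("utf-8")
    sock.sendall(HEADER.pack(len(frame_bytes), len(meta)))
    sock.sendall(frame_bytes)
    sock.sendall(meta)

def recv_exact(sock, n):
    """Read exactly n bytes (TCP recv may return short reads)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def recv_frame(sock):
    """Receive one (frame_bytes, detections) pair sent by send_frame."""
    frame_len, meta_len = HEADER.unpack(recv_exact(sock, HEADER.size))
    frame_bytes = recv_exact(sock, frame_len)
    detections = json.loads(recv_exact(sock, meta_len).decode("utf-8"))
    return frame_bytes, detections
```

On the sender you could compress each annotated frame with `cv2.imencode(".jpg", frame)` before calling `send_frame`, and decode on the receiver with `cv2.imdecode`; the JSON side carries the labels and box coordinates as the "string along with the stream".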

Stay tuned, we are preparing something that will suit your needs, and more.

Best regards

Antoine Lassagne
Senior Developer - ZED SDK
Stereolabs Support

Hello everyone,

I’d like to take advantage of this thread about streaming to ask a couple of questions.

Using the streaming example with multiple cameras on different ports, can the receiving PC, besides displaying the data, also handle recording each stream as an SVO? (This would leave the senders running and let the receiver record only when needed.)
Or should we save each SVO on each Jetson and then retrieve them?

Thank you,

Hello, yes, you can do that. There is even an option to disable decoding on the receiver, so that it records the frames exactly as it receives them (see here)
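A minimal sketch of that receiver-side recording, assuming the pyzed Python API; the IP, port, and the `svo_filename` helper are placeholders, and I believe the "disable decoding" option corresponds to `RecordingParameters.transcode_streaming_input` (please double-check against your SDK version):

```python
# Sketch: record an incoming network stream straight to an SVO on the
# receiving PC (assumes pyzed). Address and filename are placeholders.

def svo_filename(port):
    """Build a per-stream output filename from the stream port (placeholder scheme)."""
    return "cam_%d.svo" % port

if __name__ == "__main__":
    # Deferred import so the helper above works even without the SDK installed.
    import pyzed.sl as sl

    init = sl.InitParameters()
    init.set_from_stream("192.168.1.10", 30000)  # placeholder sender address

    zed = sl.Camera()
    if zed.open(init) != sl.ERROR_CODE.SUCCESS:
        raise SystemExit("could not open the stream")

    rec = sl.RecordingParameters(svo_filename(30000), sl.SVO_COMPRESSION_MODE.H264)
    # Store the received packets as-is, without decode/re-encode on the receiver
    # (the "disable decoding" option mentioned above, as I understand it).
    rec.transcode_streaming_input = False
    if zed.enable_recording(rec) != sl.ERROR_CODE.SUCCESS:
        raise SystemExit("could not start recording")

    runtime = sl.RuntimeParameters()
    while zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        pass  # each successful grab appends a frame to the SVO

    zed.disable_recording()
    zed.close()
```

One such receiver process per stream/port would let the master PC record all senders on demand while the Jetsons just keep streaming.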


I managed to get my installation working, with 8 ZEDs, 8 Xavier Jetsons, a switch, and a master PC. Each Jetson is autonomous, with a Docker container to launch the

I have a problem synchronizing the SVOs after acquisition.
The recording itself works very well, but when I look at the timestamps of each SVO, they correspond to the clock of each Jetson, not to the clock of the master PC from which the SVO recording command is issued.

I checked the documentation and found this: Setting Up Multiple 3D Cameras | Stereolabs
I have tried PTP, but I find the documentation not clear enough to set it up.

Is there something to be done in the code during SVO acquisition so that everything points to the master PC’s clock, or is PTP the solution that must be used?

Thanks in advance


Sorry for this (very) late response; I totally forgot about you. The image timestamp is indeed taken from the image source, i.e. the PC the camera itself is plugged into.
Are you still trying to use PTP? Where are you stuck?


No worries about the delay. :slight_smile:

Yes, I’m still trying to use PTP as in the tutorial. Currently, I have set up a cron job on each Jetson (so the scripts are always executed when the Jetson reboots) for these .sh scripts:

  • `sudo ptp4l -i <interface_name> -s -m`, replacing `<interface_name>` with the correct value given by `nmcli device status`
  • `sudo phc2sys -m -s /dev/ptp0 -c CLOCK_REALTIME -O 0`

On the master PC side, I just ran `sudo timedatectl set-ntp on` and `sudo ptp4l -i <interface_name> -S -m`, then launched my receiver code.
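For anyone following along, the two sides of the setup described above can be collected into one sketch (interface names and the PTP device path are placeholders; use the interface reported by `nmcli device status`):

```shell
# --- On each Jetson (slave) ---
# -s keeps ptp4l slave-only; -m prints status to stdout.
sudo ptp4l -i eth0 -s -m
# Copy the NIC's PTP hardware clock into the system clock (CLOCK_REALTIME).
sudo phc2sys -m -s /dev/ptp0 -c CLOCK_REALTIME -O 0

# --- On the master PC ---
sudo timedatectl set-ntp on
# -S selects software timestamping; ptp4l acts as master here (no -s).
sudo ptp4l -i eth0 -S -m
```

Note that `-s` (slave-only) and `-S` (software timestamping) are different ptp4l flags; if the NICs on both ends support hardware timestamping, letting ptp4l default to it is generally more accurate than `-S`.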

I’m still seeing different timestamps (CURRENT and IMAGE).