Streaming over local network

I have a few questions and would appreciate detailed assistance with each:

  1. Streaming Setup: I have two ZED 2i cameras connected to a Jetson Orin AGX. If I want to stream video from both of the cameras to my laptop at HD720 resolution, what is the maximum frame rate I can achieve for each camera? Is it 30 FPS or 60 FPS? Additionally, will the frame rate remain consistent if I stream from both cameras simultaneously?

  2. Saving Depth Maps: I plan to save depth maps on a second Jetson device on the same local network. Will the number of frames saved be the same as the number of frames displayed, or can I specify a different rate for saving depth maps? If it’s possible to save depth maps at a different rate, please provide guidance on how to achieve this.

  3. FPS Reporting: When using zed.get_current_fps, the values I receive are fractional rather than whole numbers, and they update more often than once per second. For example, at 30 FPS I expected one value per second. Can you explain why this happens and how to interpret these results?

  4. Depth Map Saving: Is there a function in the ZED SDK to determine the number of depth maps saved per second? Also, what is the best file format for saving depth maps?

  5. Can you recommend the best bitrate for each camera, please?

Thank you in advance for your help.

Hi @ashfaq,

  1. In HD720 you can reach up to 60 FPS. The consistency of the stream over the network depends entirely on your network configuration: whether you are using a wired or wireless connection, and so on.
  2. You can develop an application that saves the frames at whatever rate you choose: you simply control how often the images are written inside the grab loop of any of our samples (see the first sketch after this list).
  3. The zed.get_current_fps method returns the frame rate at which the grab method is called in your application. It returns a float value in frames per second that is re-measured at every grab() call, which is why it updates far more often than once per second (see the second sketch after this list).
  4. We do not provide such a method, as this is behavior specific to your application. The PFM or PGM formats store float values, which makes them the most appropriate choice for saving depth data as real values; the first sketch below writes PFM files.
  5. You can find the recommended bitrates for each resolution and FPS here: Local Video Streaming - Stereolabs (the last sketch below shows where the bitrate is set).
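For point 2 (and the file format question in point 4), here is a minimal sketch of that grab-loop pattern using the Python API; the same structure applies to the C++ samples. The SAVE_EVERY_N constant and the file name pattern are just illustrative choices, not SDK names:

```python
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_resolution = sl.RESOLUTION.HD720
init_params.camera_fps = 30
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

runtime = sl.RuntimeParameters()
depth = sl.Mat()
SAVE_EVERY_N = 6  # at 30 FPS this writes ~5 depth maps per second
frame = 0

while True:
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        if frame % SAVE_EVERY_N == 0:
            # Retrieve the 32-bit float depth map and write it as PFM,
            # which preserves the real (metric) depth values.
            zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
            depth.write("depth_{:06d}.pfm".format(frame))
        frame += 1
```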
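For point 3, a short sketch showing how to read and interpret the value: it is a rate in frames per second (a float such as 29.8), not a duration, and it is re-measured at every grab():

```python
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_resolution = sl.RESOLUTION.HD720
init_params.camera_fps = 30
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

runtime = sl.RuntimeParameters()
for _ in range(100):
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        fps = zed.get_current_fps()  # float, frames per second, refreshed at every grab()
        if fps > 0:
            print("measured grab rate: {:.1f} FPS (~{:.1f} ms per frame)".format(fps, 1000.0 / fps))
```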
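For point 5, the bitrate is set on the sender through StreamingParameters. The value below is a placeholder; take the Kbit/s figure recommended for your resolution and FPS from the Local Video Streaming page, and give each camera its own port:

```python
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_resolution = sl.RESOLUTION.HD720
init_params.camera_fps = 30
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

stream_params = sl.StreamingParameters()
stream_params.codec = sl.STREAMING_CODEC.H264  # H265 is also available if supported
stream_params.port = 30000                     # use a different port for each camera
stream_params.bitrate = 8000                   # Kbit/s, placeholder; see the docs table
if zed.enable_streaming(stream_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

runtime = sl.RuntimeParameters()
while True:
    zed.grab(runtime)  # each successful grab is encoded and sent to connected receivers
```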

Thanks, Mattrouss,

  1. I observed that the camera supports 60 fps for HD720 resolution, but when using two cameras, the frame rate drops to 30 fps. According to your documentation, the following configurations were tested using a single USB 3.0 controller:
    2 ZEDs in HD1080 @ 15fps and HD720 @ 30fps
    3 ZEDs in HD720 @ 15fps
    4 ZEDs in VGA @ 30fps
    Based on this information, it appears that when using the USB ports on the Jetson Orin AGX, the maximum achievable frame rate is 30 fps. Is this correct?

  2. I am having trouble understanding the instruction, “you simply have to control the rate of saving the images in the grab loop in any of our samples.” Could you provide an example of this with a sample of your choice? Additionally, is it possible to achieve this while setting fps=30 on a device connected to the camera and starting the stream?

Please assist me with accurate information and answer my concerns in detail.

Hi @ashfaq,

  1. I can confirm that plugging two USB cameras into the same USB controller will result in dropped frames at HD720@60, so it is recommended to set the frame rate to 30 FPS.
  2. In the receiver sample here: zed-sdk/camera streaming/receiver/cpp at master · stereolabs/zed-sdk · GitHub, you can save the depth images inside the while loop at whatever interval you wish (for example by checking the elapsed time, or by adding sleep() calls to adjust the rate). This works with the sender camera set to 30 FPS, because the sender and receiver programs run independently; a sketch of the idea follows below.
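A minimal sketch of that idea, written with the Python API rather than the linked C++ sample (the structure is the same): the receiver connects to the stream, grabs continuously, and decides on its own how often to write a depth map. The IP address, port and SAVE_PERIOD are placeholders:

```python
import time
import pyzed.sl as sl

init_params = sl.InitParameters()
init_params.set_from_stream("192.168.1.42", 30000)  # sender IP and port (placeholders)
init_params.depth_mode = sl.DEPTH_MODE.ULTRA

zed = sl.Camera()
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    exit(1)

runtime = sl.RuntimeParameters()
depth = sl.Mat()
SAVE_PERIOD = 1.0  # seconds between saved depth maps, independent of the 30 FPS sender
last_save = 0.0
idx = 0

while True:
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        now = time.time()
        if now - last_save >= SAVE_PERIOD:
            zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
            depth.write("depth_{:06d}.pfm".format(idx))  # float depth saved as PFM
            last_save = now
            idx += 1
```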