Improving FPS on Jetson Orin hardware with zed_ros2_wrapper

Hi, I’m currently developing on the new NVIDIA Jetson Orin hardware, i.e. Orin Nano 8GB, Orin NX 16GB, and AGX Orin 64GB, with dual ZED 2i cameras.
The setup is Jetson Linux 35.3.1, hence JetPack 5.1.1
(JetPack SDK 5.1.1 | NVIDIA Developer), ZED SDK 4.0, and ROS 2 Foxy with zed_ros2_wrapper (all installed natively).
I would like a steady 25-30 FPS on both color and depth for the two cameras, ideally at a higher resolution.

I currently reach ~20 FPS on depth and ~10 FPS on color (on the NX 16GB) with:

  • grab_resolution: VGA

  • grab_frame_rate: 60

  • pub_resolution: VGA

  • pub_frame_rate: 60

  • depth_mode: PERFORMANCE

  • openni_depth_mode: true

  • pos_tracking: disabled
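For reference, these settings correspond to parameters in the wrapper’s YAML configuration files (common.yaml / zed2i.yaml under zed_wrapper/config). The exact file paths and key nesting vary between wrapper versions, so treat the fragment below as an illustrative sketch, not the authoritative layout:

```shell
# Illustrative only: write an excerpt of the settings above in the YAML
# shape used by recent zed_ros2_wrapper releases (key nesting may differ
# in your version -- compare against zed_wrapper/config/common.yaml).
cat > /tmp/zed_common_excerpt.yaml <<'EOF'
general:
  grab_resolution: 'VGA'
  grab_frame_rate: 60
  pub_resolution: 'VGA'
  pub_frame_rate: 60.0
depth:
  depth_mode: 'PERFORMANCE'
  openni_depth_mode: true
pos_tracking:
  pos_tracking_enabled: false
EOF

# Quick sanity check that both resolutions are set to VGA
grep -c "'VGA'" /tmp/zed_common_excerpt.yaml
# -> 2
```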

On the AGX Orin 64GB I can reach the same performance with MEDIUM/HD720, but on the Orin Nano 8GB even VGA is noticeably slower. I’m most interested in making it work on the Nano 8GB.

Finally, is there a tested setup that reaches this performance, i.e., a combination of Jetson Linux / ZED SDK / ROS 2 versions or a set of camera configurations?

Thank you in advance!

Hi @fonticode
Welcome to the Stereolabs community.

Have you tried lowering grab_frame_rate to 30, since your requirement is 25-30 FPS?

Thank you @Myzhar.
Yes, I did; the performance is pretty much the same.

How are you measuring the FPS?
Are you running rviz2 on the Jetson?

No, I’m using rqt, the ROS 2 CLI, or Foxglove (with rosbridge).

rqt and rviz2 can consume significant computing power and reduce the final performance, especially on a Nano.

Here are the expected FPS on the Orin Nano when using the ZED Depth Viewer:

ZED Depth Viewer
HD720@60

  • PERFORMANCE: 60 Hz
  • QUALITY: 20 Hz
  • ULTRA: 28 Hz
  • NEURAL: 15 Hz

HD1080@30

  • PERFORMANCE: 30 Hz
  • QUALITY: 13 Hz
  • ULTRA: 25 Hz
  • NEURAL: 14 Hz

With the ZED ROS 2 Wrapper you can expect values 2-3 FPS lower, caused by the ROS 2 middleware overhead.

Makes sense. By recording the topics to a bag and playing them back later, I can observe them at the “actual” frequency. With my previous settings, I get:

  • depth_registered ~30Hz
  • image_raw_color ~20Hz
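For what it’s worth, the number that `ros2 topic hz` (or bag playback inspection) reports is essentially the inverse of the mean gap between consecutive message arrivals. A minimal sketch with made-up timestamps (not real camera data):

```shell
# Sketch of how `ros2 topic hz` derives its rate: the inverse of the
# mean gap between consecutive message timestamps (in seconds).
# The timestamps below are illustrative, not real camera data.
printf '%s\n' 0.00 0.05 0.10 0.15 0.20 |
  awk 'NR > 1 { dt += $1 - prev; n++ } { prev = $1 } END { printf "%.1f Hz\n", n / dt }'
# -> 20.0 Hz
```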

Hi @Myzhar,
After some further experiments I observed the following behaviors:

When launching both cameras with VGA/30/ULTRA (or PERFORMANCE) on the Orin NX 16GB, there is an imbalance between the two cameras: one publishes depth at 30 Hz, the other at 15 Hz.

If I set the resolution to HD720/30/ULTRA, the first camera initializes properly and runs, while the second prints the following error:

[ZED][ERROR] [ZED] Cannot initialize the camera. Try another resolution

and does not recover, eventually hitting a camera-detection timeout error (the same happens with MEDIUM/AUTO).

Moreover, launching a single camera with the ROS 2 wrapper launcher, I observed the following frequencies:
ROS 2 (Foxy) `ros2 topic hz`, launching zed2i.launch.py
HD720@60

  • PERFORMANCE: 27 Hz
  • QUALITY: 21 Hz
  • ULTRA: 20 Hz
  • NEURAL: 13 Hz

HD1080@30

  • PERFORMANCE: 13 Hz
  • QUALITY: 11 Hz
  • ULTRA: 11 Hz
  • NEURAL: 9 Hz

Bottom line:

  • VGA seems to be the only resolution at which two cameras can run simultaneously (and even then with an imbalance)
  • The gap between the expected and achieved frequencies is quite large

Have you by any chance run similar experiments? Should I try the previous version of Jetson Linux with SDK 3.8 instead of 4.0, or maybe ROS 2 Humble from a container?

We will investigate this in the next few days.
We have scheduled an extensive Jetson benchmarking session to get a better idea of the expected performance; we will publish a blog post with the results.

@Myzhar Hello, do you have an ETA on this blog post? Thanks!


@Myzhar Any updates or news on this?

Frankly speaking, we are happy with the hardware of the ZED 2i cameras for professional and production use, but the software implementation and ROS 2 drivers have very poor performance compared, for example, with the Intel RealSense SDK.

Even on the most recent hardware, such as the NVIDIA Orin series (e.g., Jetson Orin NX 16GB), we struggle to run two cameras at 25 FPS while JUST computing stereo depth with the PERFORMANCE setting under the ROS 2 wrapper.

We would appreciate it if you invested in that “low-level” tuning rather than adding models to the framework, which are nice for educational purposes but don’t help adoption in production and industrialized products.

Thanks,
MP
