ZED 2i parameter tuning questions

Hi, it’s me again :wave:

I’m still working on integrating the ZED 2i with my visual-inertial SLAM pipeline, and I only subscribe to:

  • /zed2i/zed_node/left/image_raw or /zed2i/zed_node/left/image_raw/compressed
  • /zed2i/zed_node/imu/data_raw and/or /imu/data

I’ve carefully reviewed the documentation, previous answers, and some GitHub issues — but I still have a few uncertainties about parameters in common.yaml, zed2i.yaml, and sync.yaml.


1. About pub_frame_rate and grab_frame_rate

What is the recommended relationship between these two?

  • When should pub_frame_rate < grab_frame_rate?
  • Can they be equal (e.g. both 30)? Would that risk frame drops due to timing jitter or CPU/GPU load?
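For concreteness, this is the kind of common.yaml fragment I mean (the 15/30 values are only an illustration of pub_frame_rate < grab_frame_rate, not a recommendation, and I'm assuming both parameters live under general: — they may actually be split between common.yaml and zed2i.yaml):

```yaml
general:
  grab_frame_rate: 30   # rate at which the SDK grabs frames; also feeds internal modules
  pub_frame_rate: 15    # rate at which images are published on the ROS topics (<= grab_frame_rate)
```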

2. About /zed2i/zed_node/imu/data

According to the official docs:

imu/data: Accelerometer, gyroscope, and orientation data in Earth frame

Does this mean that /imu/data provides orientation computed by internal sensor fusion (i.e. a VRU rather than a full AHRS)?


3. About sensors_timestamp_sync in common.yaml

According to earlier issue #538, if I set:

pub_frame_rate: 30
sensors_timestamp_sync: true
max_pub_rate: 200

Will IMU messages then be published at ~30 Hz (roughly matching the camera rate)? If so, is setting sensors_timestamp_sync to true therefore not recommended?
Or will the IMU still be published at 200 Hz unless I also subscribe to an image topic?


4. About sync.yaml

This file seems related to RgbdSensorsSyncNodelet, but I’m only using the raw left camera and IMU. So:

  • Can I ignore this file altogether in my case?
  • If not, what does approx_sync: false do in a system with 30 Hz images and 200 Hz IMU?
  • What exactly does queue_size control?
  • What do sub_imu and sub_mag enable — is this for internal fusion, TF matching, or bag alignment?
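To make sure I understand approx_sync and queue_size, here is my mental model as a plain-Python sketch. This is not the actual rtabmap_ros / message_filters implementation — just the matching policy as I understand it, with a made-up `slop` tolerance:

```python
# Plain-Python sketch of exact vs. approximate time sync (my understanding,
# NOT the rtabmap_ros implementation).
from collections import deque

def sync_pairs(image_stamps, imu_stamps, approx_sync, queue_size, slop=0.005):
    """Pair each image stamp with an IMU stamp.

    approx_sync=False -> stamps must match exactly (almost never happens for a
                         30 Hz camera against a 200 Hz IMU on free-running clocks).
    approx_sync=True  -> the nearest IMU stamp within `slop` seconds is accepted.
    queue_size        -> how many unmatched IMU messages are buffered; older
                         ones are dropped, so a too-small queue loses matches.
    """
    imu_queue = deque(maxlen=queue_size)  # bounded buffer, like the sync queue
    pairs = []
    imu_iter = iter(imu_stamps)
    for img_t in image_stamps:
        # Buffer IMU messages up to (and just past) the image stamp.
        for imu_t in imu_iter:
            imu_queue.append(imu_t)
            if imu_t >= img_t:
                break
        if approx_sync:
            candidates = [t for t in imu_queue if abs(t - img_t) <= slop]
        else:
            candidates = [t for t in imu_queue if t == img_t]
        if candidates:
            pairs.append((img_t, min(candidates, key=lambda t: abs(t - img_t))))
    return pairs

images = [i / 30.0 for i in range(10)]          # 30 Hz camera
imus = [i / 200.0 + 0.001 for i in range(70)]   # 200 Hz IMU, 1 ms clock offset
print(len(sync_pairs(images, imus, approx_sync=False, queue_size=10)))  # → 0
print(len(sync_pairs(images, imus, approx_sync=True, queue_size=10)))   # → 10
```

With exact matching the two free-running clocks never produce identical stamps, so no pairs come out; with approximate matching every image finds an IMU sample within the tolerance.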

5. Kalibr extrinsics and 3D visualization

I plan to calibrate the camera-IMU extrinsics using Kalibr.

  • The frame_id of /zed2i/zed_node/left_raw/image_raw_gray is zed2i_left_camera_optical_frame
  • The frame_id of /zed2i/zed_node/imu/data is zed2i_imu_link

Can I confirm that:

  • zed2i_left_camera_optical_frame follows OpenCV’s RDF (right-down-forward) convention?
  • zed2i_imu_link follows ROS’s FLU (forward-left-up) convention?
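To make the two conventions concrete, here is the fixed rotation I expect between an OpenCV-style optical frame (x right, y down, z forward) and a ROS body frame (x forward, y left, z up) — my own sketch, the frame/variable names are mine:

```python
# Columns of R are the optical-frame axes expressed in the body frame:
#   x_optical (right)   -> -y_body
#   y_optical (down)    -> -z_body
#   z_optical (forward) -> +x_body
R = [
    [0,  0, 1],
    [-1, 0, 0],
    [0, -1, 0],
]

def apply(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# The camera's optical axis (z in the optical frame) should map to
# "forward" (+x) in the body frame.
print(apply(R, [0, 0, 1]))  # → [1, 0, 0]
```

If the Kalibr result composed with this rotation doesn't land the IMU axes in FLU, I'd suspect a frame-convention mix-up.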

I want to visualize the result in 3D after calibration and ensure the transforms align with the physical layout, similar to what I’ve done before with another device using Kalibr.


6. ZED 2i sensor readout time

What is the readout time for the ZED 2i sensor?


Thanks again for your clarification!

When you do not need subscriber nodes to receive data at the full grab rate, but you want to keep the grab rate higher for internal ZED node processing (e.g. Positional Tracking).

Yes, but this could require more computing power. It’s up to you to tune the parameters according to your requirements.

Accelerations and angular velocities come from the IMU sensor; the ZED SDK provides orientation from positional tracking.

IMU data is internally retrieved at the full rate, but published in sync with the grabbed frames. This can be required for specific processing where you only need IMU data that is synchronized with the images.

Are you still using the ROS 1 node? We recommend moving to ROS 2: ROS 1 will reach EOL at the end of May 2025, and the ZED ROS Wrapper is now obsolete.

Yes

See message_filter parameters for more information.

Yes

Yes, see REP 105.

What do you mean?

Thanks again for your previous clarifications.

I’d like to follow up with two more questions related to /imu/data.orientation and rolling shutter timing.


1. Regarding the pos_tracking_enabled parameter in common.yaml:

pos_tracking:
  pos_tracking_enabled: false

According to the Positional Tracking Overview, it seems that the SDK uses visual-inertial fusion to compute the 6DoF pose, including orientation. However, even when I disable pos_tracking_enabled, I still see valid orientation values in /zed2i/zed_node/imu/data (max_pub_rate = 200 Hz).

So I’m trying to confirm:

  • When pos_tracking_enabled is set to false, is the orientation in /imu/data still computed from the visual tracking module?
  • Or does it fall back to a pure IMU-only fusion (i.e., based only on accelerometer + gyroscope, like a VRU filter)?
  • In other words, is /imu/data.orientation ever usable as a standalone inertial orientation estimate, or is it always tied to visual tracking?
  • If it is tied to visual tracking, doesn't merging inertial information with visual tracking to produce /imu/data.orientation seem somewhat counterintuitive, even redundant?
    It feels like the VIO system is feeding back into the IMU topic.
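For reference, this is what I mean by a VRU-style, IMU-only orientation: a toy complementary filter (my own sketch, not the SDK's actual filter) in which roll and pitch are corrected by the gravity direction while yaw only integrates the gyro and therefore drifts:

```python
import math

# Toy VRU-style complementary filter: orientation from gyro + accelerometer
# only (no magnetometer, no vision). Roll/pitch are observable from gravity;
# yaw has no absolute reference, so it drifts with gyro bias.
def vru_step(roll, pitch, yaw, gyro, accel, dt, alpha=0.98):
    gx, gy, gz = gyro          # rad/s, body frame
    ax, ay, az = accel         # m/s^2, body frame
    # 1) propagate all three angles with the gyro
    roll_g, pitch_g, yaw_g = roll + gx * dt, pitch + gy * dt, yaw + gz * dt
    # 2) the gravity direction gives an absolute roll/pitch reference
    roll_a = math.atan2(ay, az)
    pitch_a = math.atan2(-ax, math.hypot(ay, az))
    # 3) blend: mostly gyro (smooth), slowly corrected by accel (drift-free)
    roll = alpha * roll_g + (1 - alpha) * roll_a
    pitch = alpha * pitch_g + (1 - alpha) * pitch_a
    return roll, pitch, yaw_g  # yaw is pure integration -> it drifts

# Static sensor with a small z-gyro bias: roll/pitch stay at zero, yaw drifts.
r = p = y = 0.0
for _ in range(2000):          # 10 s at 200 Hz
    r, p, y = vru_step(r, p, y, gyro=(0.0, 0.0, 0.01), accel=(0.0, 0.0, 9.81), dt=0.005)
print(round(r, 3), round(p, 3), round(y, 3))  # → 0.0 0.0 0.1
```

That drifting yaw is exactly the behavior I'd expect from an IMU-only /imu/data.orientation, as opposed to a visually-aided one.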

2. About the readout time of the ZED 2i sensor

I could only find the readout time of the ZED 2 camera.

Could you confirm the rolling shutter readout time for the ZED 2i sensor?


Thanks again for your support — really appreciate the technical clarity.

No, it only uses inertial information.

Exactly

It’s always valid information.

Sensor fusion is normally performed to improve the final result, so it’s not redundant.

What do you mean by rolling shutter readout time?

Thank you for taking the time to reply—much appreciated.
