Synchronized Depth and Stereo RGB Topics/Images

I’m using multiple ZED camera systems over GMSL2. One uses two ZED X Mini cameras; another uses a stereo pair of ZED X One cameras (potentially more than two in the future). I’ve been referencing this thread and this thread.

For each of these systems, I’m trying to understand whether the depth and RGB images produced by the ZED SDK should be synchronized when they are emitted, and how close their timestamps should be expected to be. I’m considering the following two scenarios:

  1. Depth and RGB image topics coming from the zed_wrapper (either live or via SVO2 file replay)
  2. Depth and RGB images viewed in ZED_Depth_Viewer.

Regarding ZED_Depth_Viewer: when I use two ZED X One cameras configured as a stereo pair and view them in ZED_Depth_Viewer, the depth image reports one frame rate (e.g. 9-10 Hz) while the RGB images report another (e.g. 15 Hz).

Ultimately, for the systems I’ve mentioned, should I expect depth and RGB images/topics to be synchronized in the zed_wrapper and in ZED_Depth_Viewer?
If so, how close should their timestamps be in the two cases above?

ZED Depth Viewer shows the grab rate (15 Hz) and the depth processing rate (9-10 Hz). This does not mean the images are published at different rates: depth and RGB are always synchronized and published together at the depth rate.

For two frames to be considered synchronized, their timestamp difference must be smaller than the grab period, i.e. 1/grab_rate.
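To make that criterion concrete, here is a minimal sketch (the function name and nanosecond timestamp convention are my own, not part of the ZED SDK) that checks whether a depth/RGB frame pair counts as synchronized under the rule above:

```python
def frames_synchronized(t_rgb_ns: int, t_depth_ns: int, grab_rate_hz: float) -> bool:
    """Return True if two frame timestamps (in nanoseconds) differ by
    less than the grab period (1 / grab_rate), the threshold described above."""
    grab_period_ns = 1e9 / grab_rate_hz
    return abs(t_rgb_ns - t_depth_ns) < grab_period_ns

# Example at a 15 Hz grab rate (grab period ~66.7 ms):
# a 50 ms offset is within one grab period, a 70 ms offset is not.
print(frames_synchronized(0, 50_000_000, 15.0))  # True
print(frames_synchronized(0, 70_000_000, 15.0))  # False
```

The same threshold can be used as the `slop` value if you pair the wrapper's depth and RGB topics with an approximate-time synchronizer on the ROS side.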