I am doing motion-based extrinsic calibration between a ZED2 and a VLP-16, so I need odometry data from the ZED2 and raw point clouds from the VLP-16 (I do the lidar odometry offline). What is the best way to synchronize the point clouds with the ZED2 odometry? I am currently using message_filters from ROS2. I am also wondering how the ZED2 decides the initial position of its odometry frame. Is it the place where I start my ZED2 node? And one more question: is there some trigger that would allow instantiating the initial odometry frame at a desired time?
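For context, the pairing that `message_filters.ApproximateTimeSynchronizer` performs can be reduced to matching each point-cloud stamp to the nearest odometry stamp within a tolerance ("slop"). A minimal plain-Python sketch of that matching logic (a hypothetical helper, not the actual message_filters implementation):

```python
def pair_by_stamp(cloud_stamps, odom_stamps, slop=0.05):
    """Match each point-cloud stamp to the closest odometry stamp
    within `slop` seconds; clouds with no match are dropped."""
    pairs = []
    for c in cloud_stamps:
        best = min(odom_stamps, key=lambda o: abs(o - c), default=None)
        if best is not None and abs(best - c) <= slop:
            pairs.append((c, best))
    return pairs

# Example: 10 Hz point clouds against ~30 Hz odometry
clouds = [0.00, 0.10, 0.20]
odoms = [0.001, 0.034, 0.067, 0.099, 0.133, 0.166, 0.198]
print(pair_by_stamp(clouds, odoms))
```

In ROS2 itself, `ApproximateTimeSynchronizer(subs, queue_size, slop)` does this matching on the message headers for you; since the clouds arrive at a lower rate than the odometry, keep the slop below half the lidar period so each cloud pairs with at most one odometry sample.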
I suggest you perform the laser/camera calibration outside the ROS2 framework.
The ZED ROS2 Wrapper internally performs data conversion to provide the odometry data in the
base_link frame, and to do so it uses the information provided by the URDF file.
Regarding the origin of the odometry system: it is the point where the positional tracking node is started, unless you provide a prior pose as initialization.
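As a sketch of what such an initialization looks like, the wrapper's positional-tracking parameters accept an initial pose of the `base_link` frame (parameter names here follow the wrapper's `common.yaml`; verify them against the release you are running):

```yaml
pos_tracking:
  # x, y, z [m], roll, pitch, yaw [rad] of base_link at tracking start
  initial_base_pose: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

Regarding a runtime trigger: recent wrapper versions also expose a `set_pose` service that re-seeds the pose while the node is running, which can act as the "instantiate at a desired time" mechanism you mention; check the services advertised by your node to confirm it is available in your version.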
Ok, thanks for the information. Regarding the timing: when I dropped the depth quality from NEURAL to QUALITY, my frame rate increased by 1-2 Hz. Does this mean that with NEURAL the ZED2 cannot keep up with the desired frame rate?
What kind of host device are you using? GPU card model?
Intel i7-10750H CPU @ 2.60GHz × 12
NVIDIA GeForce RTX 2060
The NEURAL depth mode uses more GPU resources than the QUALITY depth mode, but NEURAL is more precise and detailed.
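If NEURAL saturates the GPU at your configured rate, one option is to lower the requested grab rate rather than the depth quality. A hedged config sketch (parameter names vary between wrapper releases, so check your `common.yaml`):

```yaml
general:
  grab_frame_rate: 15     # request a lower rate the GPU can sustain
depth:
  depth_mode: 'NEURAL'    # keep the more precise mode at the reduced rate
```

Whether 15 Hz with NEURAL or a higher rate with QUALITY is preferable depends on how much your calibration pipeline relies on depth accuracy versus temporal density of the odometry.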