How does Positional Tracking handle losing location?

I’m currently using the ZED2i stereo camera with the ZED SDK (v3.8, Python API). For my application I need to track the pose of the camera, and for this I’m using the Positional Tracking feature. IMU fusion is currently enabled.

In some cases the camera is recording footage against an almost uniform background and gets stuck in the SEARCHING state for, say, 10–20 frames at a time. I’m not able to share the footage, but you can picture something like a long wall with a door.

In a situation like this, how is the pose estimate calculated? I’m definitely still getting pose updates. Is the IMU data being used alone in this scenario where visual odometry fails, or is something else happening that accounts for this?
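
For reference, here’s a simplified sketch of my grab loop (not my exact code; the class and enum names are from the ZED Python API, but treat the structure as illustrative):

```python
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.coordinate_units = sl.UNIT.METER
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open camera")

# Positional tracking with IMU fusion enabled, as described above
tracking_params = sl.PositionalTrackingParameters()
tracking_params.enable_imu_fusion = True
zed.enable_positional_tracking(tracking_params)

pose = sl.Pose()
sensors = sl.SensorsData()
while zed.grab() == sl.ERROR_CODE.SUCCESS:
    # get_position fills `pose` and returns the tracking state
    state = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    if state == sl.POSITIONAL_TRACKING_STATE.SEARCHING:
        # This is the case I'm asking about: the state is SEARCHING,
        # yet the pose keeps updating every frame
        translation = pose.get_translation(sl.Translation()).get()
        print("SEARCHING, pose still updating:", translation)
        # I can read the IMU directly, which is why I wonder whether
        # the SDK falls back to IMU-only integration in this state
        if zed.get_sensors_data(sensors, sl.TIME_REFERENCE.IMAGE) == sl.ERROR_CODE.SUCCESS:
            imu_orientation = sensors.get_imu_data().get_pose().get_orientation().get()
```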

I’m currently considering how to improve tracking performance, so understanding this would be helpful. (I’m not able to build a map before recording, simply due to the use case.)

Hello,

When the SDK detects uncertainties in the positional tracking, the state becomes SEARCHING.
It still continues the tracking, but the new poses may be relative to a false pose.
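
On your side, you can gate on the returned state (and the pose confidence) before trusting a pose. A minimal sketch, reusing the loop from the question; the threshold of 50 is an arbitrary example, and `pose_confidence` is the 0–100 field exposed on sl.Pose:

```python
import pyzed.sl as sl

def trusted_pose(zed, pose):
    # Return the pose only when tracking is OK and confidence is high.
    # During SEARCHING the pose keeps updating, but it may be expressed
    # relative to a false anchor until the tracker relocalizes.
    state = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    if state == sl.POSITIONAL_TRACKING_STATE.OK and pose.pose_confidence > 50:
        return pose
    return None
```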

Antoine

What frame is the pose updated with respect to, internally? Are new poses relative to the pose from one frame before?
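
For context, I can query the pose against either of the two reference frames the API exposes, and I’m not sure which one the internal update corresponds to:

```python
# Pose of the camera in the world frame fixed at tracking start
state_world = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
# Motion relative to the previous camera frame (frame-to-frame delta)
state_camera = zed.get_position(pose, sl.REFERENCE_FRAME.CAMERA)
```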

Hello, I found that if a wall is close to the camera, at about 40 cm, the positional tracking state shows SEARCHING and I lose the pose and translation information of the camera. What is the problem here, and how can I fix it?

Hi, walls without texture are difficult for stereo cameras to compute depth on. The SLAM will drift more in such conditions. But can you give us a little more context?
Please open a new ticket for this.
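
In the meantime, one thing worth checking is the minimum depth range: a wall at ~40 cm can sit near or inside it, leaving the tracker with few usable depth points. A sketch, assuming your current minimum distance is above that:

```python
import pyzed.sl as sl

init_params = sl.InitParameters()
init_params.coordinate_units = sl.UNIT.METER
# Allow depth to be computed closer to the camera so a nearby wall
# still falls inside the usable depth range
init_params.depth_minimum_distance = 0.3  # meters
```

Note that lowering the minimum distance increases computation time, so keep it as high as your scene allows.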