Detected objects change position

Hello!

I am working on a project in which a moving ZED 2 camera detects apples.

Using the Custom Detector sample, I added a for loop that prints the IDs and positions (WORLD reference frame) of the detected apples.

I noticed that the application prints the same ID with different positions, even though the apples are static. What could be causing this?

Thanks!

Hi @MartinEsche
can you send a video showing this behavior?

Hi, thanks for your answer!

Maybe it’s easier if I show you the output:

0: 256x416 8 apples, 2212.3ms
Speed: 14.7ms preprocess, 2212.3ms inference, 9.5ms postprocess per image at shape (1, 3, 256, 416)
Results saved to runs/segment/predict7
104 [     1.0281     0.13292     -1.2169]
105 [      1.136    -0.18523     -1.2626]
106 [     1.2235     0.33665     -1.2672]
107 [     1.0185    -0.21692     -1.5278]
108 [     1.1469    -0.51164     -1.6873]

0: 256x416 4 apples, 2207.8ms
Speed: 7.2ms preprocess, 2207.8ms inference, 9.3ms postprocess per image at shape (1, 3, 256, 416)
Results saved to runs/segment/predict7
107 [    0.76053    -0.38538     -1.3505]
105 [    0.80755    -0.45912     -1.3753]
108 [     1.0143    -0.37134     -1.4274]
104 [     1.3301    -0.29328     -1.5555]

You can see that the same IDs have different positions.

Is it possible that you are printing the LABEL and not the unique ID?

A video would still be useful, though, to understand more.

I think it is the unique ID; I'm printing it with this for loop:

for object in objects.object_list:
  print("{} {}".format(object.id, object.position))
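For completeness, printing both fields side by side would make the distinction explicit. This is only a sketch against the ZED SDK Python API (pyzed), assuming `objects` is the `sl.Objects` instance already retrieved in the sample: `object.id` is the per-track unique ID assigned by the tracker, while `object.raw_label` is the class index coming from the custom detector.

```python
# Sketch, assuming pyzed: requires a camera or SVO file to actually run.
# object.id      -> unique tracking ID (stable across frames if tracking works)
# object.raw_label -> detector class index (same for every apple)
for object in objects.object_list:
    print("id={} raw_label={} position={}".format(
        object.id, object.raw_label, object.position))
```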

Is this video useful?

Thanks again!

It isn't. Honestly, I cannot understand the problem you are facing.
If object tracking is enabled it’s expected that the same object has the same ID.

Are the IDs wrongly associated frame by frame?
Can you show me what’s happening? I cannot see it from the video you sent.

From what I can observe, every time the object_list is updated, IDs that were already listed change their corresponding position, so yes, it seems IDs are wrongly associated frame by frame.

In addition, the playback of the SVO lags and the number of detected apples seems way too low. This makes me think all the problems could be due to hardware limitations (I’m working on a Jetson Nano).

I wouldn’t know how to explain it better nor what else to show you, sorry…

Yes, AI processing on Jetson Nano is too slow and it can cause this kind of problem.
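The failure mode can be sketched in plain Python (this is an illustrative toy, not the SDK's actual tracker): a frame-to-frame matcher typically associates each detection with the nearest previous track within some distance gate. If processing is slow and frames are dropped, the apparent motion between processed frames grows, the gate fails, and IDs get swapped or reassigned even though the apples are static. Here `associate` and `max_dist` are made-up names for the sketch.

```python
import math

def associate(prev, curr, max_dist):
    """Greedy nearest-neighbour ID association with a distance gate.
    prev: {id: (x, y)} tracks from the last processed frame
    curr: list of (x, y) detections in this frame
    Returns {id: (x, y)}; detections with no track nearby get new IDs."""
    assigned = {}
    free = dict(prev)                    # tracks still available to match
    next_id = max(prev, default=-1) + 1
    for det in curr:
        best_id, best_d = None, max_dist
        for oid, pos in free.items():
            d = math.dist(det, pos)
            if d < best_d:
                best_id, best_d = oid, d
        if best_id is None:              # nothing close enough: new track
            best_id = next_id
            next_id += 1
        else:
            free.pop(best_id)
        assigned[best_id] = det
    return assigned

# Two static apples; the camera pans, so detections shift in camera space.
frame0 = {0: (0.0, 0.0), 1: (1.0, 0.0)}

# Fast processing: small shift per processed frame, IDs stay correct.
fast = associate(frame0, [(-0.1, 0.0), (0.9, 0.0)], max_dist=0.5)

# Slow processing (frames dropped): large apparent shift, the gate fails,
# one apple gets a fresh ID and another inherits the wrong one.
slow = associate(frame0, [(-1.1, 0.0), (-0.1, 0.0)], max_dist=0.5)
```

Running the two cases shows `fast` keeping IDs 0 and 1 on the right apples, while `slow` attaches ID 0 to the wrong apple and invents a new ID for the other.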
You could try to disable the real-time playback mode for the SVO to force the SDK to process each frame → svo_real_time_mode
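A minimal configuration sketch using the pyzed API (the SVO file name is a placeholder):

```python
import pyzed.sl as sl

init_params = sl.InitParameters()
init_params.set_from_svo_file("recording.svo")  # hypothetical SVO path
# Disable real-time playback so grab() steps through every recorded frame
# instead of skipping frames to keep up with wall-clock time.
init_params.svo_real_time_mode = False
```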

Thank you very much!

It works much better.

I still have the issue with the IDs and the positions, but I will just blame it on the Jetson and ignore it.

Thanks again!