Low Frame Rate with ZED 2i at 60 FPS Due to Fusion and Inference Pipeline – Seeking Best Practices

Hi Team,

I’m working with the ZED 2i camera and have set the capture FPS to 60, but in practice, I’m observing a significantly reduced frame rate of around 3 FPS.

  • I call grab() and then run sensor fusion (IMU, magnetometer, GPS) and YOLO-based inference on each frame.
  • These downstream steps (fusion + inference) take longer than the camera’s frame interval, so frames get dropped (a minimal sketch of this loop follows the list).
  • I’m aware that, per the documentation, grab() drops frames if it isn’t called in real time.
  • The camera is mounted on a moving vehicle, so the low frame rate loses critical spatial and temporal data, which degrades downstream geolocation and asset-detection accuracy.
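For context, here is a minimal sketch of the kind of synchronous loop I mean, using pyzed. The two `run_*` functions are hypothetical stand-ins for my actual fusion and inference code; the sleeps just simulate their combined ~0.3 s cost, which is what drags the loop down to ~3 FPS:

```python
import time
import pyzed.sl as sl

def run_fusion(cam):
    """Hypothetical stand-in for the IMU/magnetometer/GPS fusion step."""
    time.sleep(0.15)

def run_yolo_inference(img):
    """Hypothetical stand-in for the YOLO inference step."""
    time.sleep(0.15)

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_fps = 60  # requested capture rate
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")

image = sl.Mat()
runtime_params = sl.RuntimeParameters()

while True:
    if zed.grab(runtime_params) != sl.ERROR_CODE.SUCCESS:
        continue
    zed.retrieve_image(image, sl.VIEW.LEFT)
    # Both heavy steps run before the next grab(). At ~0.3 s per iteration
    # the loop can only call grab() about 3 times per second, so the SDK
    # drops all the 60 FPS frames captured in between.
    run_fusion(zed)
    run_yolo_inference(image)
```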

My Questions:

  1. What are the best practices to avoid losing frames while performing heavy computation (fusion + inference)?
  2. Is there a recommended multi-threading or queue-based strategy to decouple grab() from post-processing?
  3. Can we access raw camera buffers asynchronously or cache them before processing to maintain real-time behavior?
  4. Would using an asynchronous pipeline with grab() in a producer thread and fusion/inference in consumer threads be viable with the SDK? (See the sketch after this list.)
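
To make question 4 concrete, here is a minimal sketch of the producer/consumer decoupling I have in mind, assuming pyzed plus Python's standard threading and queue modules. `process_frame()` is a hypothetical placeholder for the fusion + YOLO stage; the frame copy is needed because the `sl.Mat` buffer is reused on the next grab():

```python
import queue
import threading
import time

import pyzed.sl as sl

frame_queue = queue.Queue(maxsize=4)  # bounded: drop work, never stall grab()

def process_frame(frame):
    """Hypothetical stand-in for the fusion + YOLO stage."""
    time.sleep(0.3)

def producer(zed: sl.Camera, stop: threading.Event) -> None:
    """Call grab() at the camera rate and hand frame copies downstream."""
    image = sl.Mat()
    runtime_params = sl.RuntimeParameters()
    while not stop.is_set():
        if zed.grab(runtime_params) != sl.ERROR_CODE.SUCCESS:
            continue
        zed.retrieve_image(image, sl.VIEW.LEFT)
        frame = image.get_data().copy()  # copy: the sl.Mat is reused next grab
        try:
            frame_queue.put_nowait(frame)
        except queue.Full:
            pass  # consumer is behind: skip this frame instead of blocking

def consumer(stop: threading.Event) -> None:
    """Run the heavy stage at whatever rate it can sustain."""
    while not stop.is_set():
        try:
            frame = frame_queue.get(timeout=0.5)
        except queue.Empty:
            continue
        process_frame(frame)

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_fps = 60
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")

stop = threading.Event()
for target, args in ((producer, (zed, stop)), (consumer, (stop,))):
    threading.Thread(target=target, args=args, daemon=True).start()
```

My understanding is that this wouldn't make the slow stage keep up with 60 FPS (frames the consumer can't handle are still skipped at the queue), but grab() would stay real-time, so the SDK's internal tracking/fusion keeps fresh data and I control which frames are dropped. I'd appreciate confirmation that this is safe and recommended with the SDK.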

Any guidance on maintaining high frame rate while still running complex inference would be greatly appreciated.

Thanks!

Hi @karthikreddy157, may I ask how you were able to observe the frame rate at runtime? Sorry that I can't help with the issue itself, by the way.

Hi @immanueln98
No worries at all, appreciate your interest!

I use two approaches to monitor the frame rate at runtime:

  1. Logging timestamps in my main loop – I record the current second and count how many frames are processed per second (a sketch is below).
  2. Enabling recording – I enable SVO recording, then use ZED Explorer to play back the .svo file and check the actual frame rate visually.
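
Here is a minimal sketch of approach 1, counting grabbed frames per wall-clock second. For cross-checking, the SDK also exposes get_current_fps() on the camera object, if I remember the Python API correctly:

```python
import time
import pyzed.sl as sl

zed = sl.Camera()
init_params = sl.InitParameters()
init_params.camera_fps = 60
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")

runtime_params = sl.RuntimeParameters()
frame_count = 0
window_start = time.monotonic()

while True:
    if zed.grab(runtime_params) != sl.ERROR_CODE.SUCCESS:
        continue
    frame_count += 1
    # ... fusion + inference would happen here ...
    now = time.monotonic()
    if now - window_start >= 1.0:
        # Report frames processed in the last ~1 s window, plus the
        # SDK's own counter for comparison.
        print(f"processed {frame_count} frames in the last second "
              f"(SDK reports {zed.get_current_fps():.1f} FPS)")
        frame_count = 0
        window_start = now
```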