ZED 2i as a camera tracker for virtual production

Hello everyone,

I am planning to build a small Virtual Production (VP) setup. I have heard that the ZED 2i can be used as a camera tracker, but I have a few concerns:

  1. Drift on Green Screens: Does the ZED 2i experience noticeable drifting when used in front of a green screen? I know this was a major issue with the Intel RealSense T265.

  2. Jitter and Synchronization: Since the ZED 2i lacks genlock, is there a reliable way to correct the timing offset between the video signal and the tracking data? Are there options to add a delay value to the tracker to align them?

  3. FreeD Protocol & Software Support: Does the ZED 2i support the FreeD protocol? I am looking to connect it to Assimilate Live FX in addition to Unreal Engine 5.

  4. Simultaneous Tracking and Depth: Since the ZED 2i relies on CUDA, can a single sensor provide both camera tracking data and depth information simultaneously? My goal is to have video subjects pass both in front of and behind 3D objects in Unreal Engine (AR/VR workflow).

Hi @psychoanima
Welcome to the Stereolabs community.

1. A pure green screen, with no visual references in front of it, lacks the features needed to track the camera position. In that case the camera falls back on its inertial sensors, but those are reliable only for short periods; over longer takes, drift is expected.

2. The ZED SDK attaches a precise timestamp to every image and tracking sample, which you can use as a common reference for synchronization.
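One way to use those timestamps is to buffer tracking samples with their timestamps, then for each video frame look up the nearest sample after compensating a measured pipeline delay. A minimal pure-Python sketch (the function name and the delay value are illustrative, not part of the SDK):

```python
import bisect

def nearest_tracking_sample(track_ts_ns, frame_ts_ns, delay_ns=0):
    """Return the index of the tracking timestamp closest to a video
    frame timestamp, after shifting the frame by a fixed delay.

    track_ts_ns: sorted list of tracking-sample timestamps (nanoseconds)
    frame_ts_ns: timestamp of the video frame (nanoseconds)
    delay_ns:    measured video-pipeline latency to compensate for
    """
    target = frame_ts_ns - delay_ns  # the frame shows the scene as it was delay_ns ago
    i = bisect.bisect_left(track_ts_ns, target)
    if i == 0:
        return 0
    if i == len(track_ts_ns):
        return len(track_ts_ns) - 1
    # Pick whichever neighbour is closer to the target timestamp.
    before, after = track_ts_ns[i - 1], track_ts_ns[i]
    return i if after - target < target - before else i - 1

# Example: tracking at ~100 Hz, frame arrives with 20 ms pipeline delay.
samples = [0, 10_000_000, 20_000_000, 30_000_000, 40_000_000]
idx = nearest_tracking_sample(samples, 41_000_000, delay_ns=20_000_000)
print(idx)  # → 2 (the sample recorded ~1 ms before the compensated frame time)
```

The delay value is exactly the "add a delay to the tracker" knob you asked about: measure the video chain's latency once, then apply it as a constant offset here.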

3. The FreeD protocol is not supported natively, but you can implement it yourself on top of the ZED SDK API by packing the positional tracking data into FreeD packets.
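FreeD is a simple UDP protocol, so the bridge is mostly byte packing. A minimal encoder for the widely used type-D1 packet might look like the sketch below; the field layout, scaling factors, and checksum rule follow common third-party implementations (e.g. Unreal's Live Link FreeD source), so verify them against your receiver (Live FX / UE5):

```python
def _i24(value: int) -> bytes:
    """Pack a signed integer into 3 big-endian bytes."""
    return value.to_bytes(3, "big", signed=True)

def freed_d1_packet(camera_id: int, pan: float, tilt: float, roll: float,
                    x_mm: float, y_mm: float, z_mm: float,
                    zoom: int = 0, focus: int = 0) -> bytes:
    """Build a 29-byte FreeD type-D1 packet.

    Angles are in degrees (scaled by 32768, i.e. 15 fractional bits),
    positions in millimetres (scaled by 64, i.e. 6 fractional bits),
    per the commonly used FreeD D1 layout -- verify for your receiver.
    """
    body = bytes([0xD1, camera_id & 0xFF])
    body += _i24(round(pan * 32768))
    body += _i24(round(tilt * 32768))
    body += _i24(round(roll * 32768))
    body += _i24(round(x_mm * 64))
    body += _i24(round(y_mm * 64))
    body += _i24(round(z_mm * 64))
    body += _i24(zoom) + _i24(focus)
    body += b"\x00\x00"                   # spare / user-defined bytes
    checksum = (0x40 - sum(body)) & 0xFF  # FreeD checksum convention
    return body + bytes([checksum])

pkt = freed_d1_packet(1, pan=10.0, tilt=-5.0, roll=0.0,
                      x_mm=1000.0, y_mm=0.0, z_mm=1500.0)
print(len(pkt))  # → 29
# Send each packet to the receiver over UDP (address is illustrative), e.g.:
#   sock.sendto(pkt, ("192.168.1.50", 40000))
```

Feed this with the pose you retrieve from the ZED SDK each frame, and both Live FX and UE5 can consume it as a standard FreeD tracker.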

4. Yes, a single ZED 2i can provide both positional tracking data and a depth map simultaneously; both are computed from the same grabbed frame.
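A minimal sketch of a single grab loop pulling both, using the ZED SDK Python API (pyzed); it needs a connected ZED 2i and a CUDA GPU to run, and the parameter choices here are illustrative:

```python
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
init.coordinate_units = sl.UNIT.MILLIMETER
init.depth_mode = sl.DEPTH_MODE.ULTRA   # any depth mode works; pick per quality/perf needs
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open ZED camera")

zed.enable_positional_tracking(sl.PositionalTrackingParameters())

pose = sl.Pose()
depth = sl.Mat()
runtime = sl.RuntimeParameters()

for _ in range(100):  # one iteration per video frame
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        # Camera pose for driving the virtual camera in UE5 / Live FX...
        zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
        # ...and the depth map of the same frame, for occlusion compositing.
        zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
        t = pose.get_translation(sl.Translation()).get()
        print(pose.timestamp.get_nanoseconds(), t)

zed.close()
```

The depth map is what lets your live subjects pass in front of or behind 3D objects: composite per pixel by comparing the measured depth against the virtual scene's depth.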