I am planning to build a small Virtual Production (VP) setup. I have heard that the ZED 2i can be used as a camera tracker, but I have a few concerns:
Drift on Green Screens: Does the ZED 2i experience noticeable drifting when used in front of a green screen? I know this was a major issue with the Intel RealSense T265.
Jitter and Synchronization: Since the ZED 2i lacks genlock, is there a reliable way to compensate for the offset between the video signal and the tracking data? Are there options to add a delay value to the tracker to align them?
FreeD Protocol & Software Support: Does the ZED 2i support the FreeD protocol? I am looking to connect it to Assimilate Live FX in addition to Unreal Engine 5.
Simultaneous Tracking and Depth: Since the ZED 2i relies on CUDA, can a single sensor provide both camera tracking data and depth information simultaneously? My goal is to have video subjects pass both in front of and behind 3D objects in Unreal Engine (AR/VR workflow).
A pure green screen with nothing in front of it lacks the visual features needed to track the camera's position. In that case the inertial sensors take over, but they are only reliable for short periods; over longer takes, drift is expected.
The ZED SDK attaches a precise timestamp to every image and tracking sample, which can be used as a reference for synchronization.
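To illustrate how such a timestamp reference can absorb a video delay: below is a minimal sketch, assuming tracking samples and video frames each carry a nanosecond timestamp (as ZED SDK image timestamps do). The buffer structure, function name, and delay value are illustrative, not part of the SDK.

```python
from bisect import bisect_left

def align_tracking_to_frame(frame_ts_ns, tracking_buffer, delay_ns=0):
    """Pick the buffered tracking sample closest to a video frame's
    timestamp, after shifting the frame time by a user-set delay.

    tracking_buffer: list of (timestamp_ns, pose) tuples, sorted by time.
    delay_ns: positive values look further back in the tracking history,
    compensating for latency in the video pipeline.
    """
    target = frame_ts_ns - delay_ns
    timestamps = [ts for ts, _ in tracking_buffer]
    i = bisect_left(timestamps, target)
    # Choose whichever neighbor is nearest to the target time.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(tracking_buffer)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - target))
    return tracking_buffer[best][1]

# Example: tracking samples every 2.5 ms, a frame timestamped at
# 10 ms, and a 2.5 ms video delay dialed in.
buffer = [(t * 2_500_000, f"pose{t}") for t in range(10)]
print(align_tracking_to_frame(10_000_000, buffer, delay_ns=2_500_000))
# -> pose3
```

Tuning `delay_ns` by eye (nudging it until real and virtual elements lock together) is the usual substitute for genlock in setups like this.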
FreeD is not supported natively, but you can use the ZED SDK API to retrieve the camera pose and broadcast it in FreeD format yourself.
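As a rough sketch of the bridge you would write: the D1 packet layout below (24-bit fixed-point fields, 0x40-based checksum) follows the commonly published description of the FreeD protocol, but the scaling factors should be verified against the receiver's documentation (e.g. Live FX or the UE Live Link FreeD source). The function and its defaults are illustrative.

```python
import struct

def _int24(value: int) -> bytes:
    """Pack a signed integer into 3 big-endian bytes (24-bit two's complement)."""
    return struct.pack(">i", value)[1:]

def freed_d1_packet(cam_id, pan, tilt, roll, x, y, z, zoom=0, focus=0):
    """Build a FreeD 'Type D1' pose packet.

    Angles are in degrees, positions in millimetres. The scalings
    (angle * 32768, position * 64) follow the commonly cited FreeD
    description; check them against your receiver.
    """
    body = bytes([0xD1, cam_id & 0xFF])
    body += _int24(round(pan * 32768))
    body += _int24(round(tilt * 32768))
    body += _int24(round(roll * 32768))
    body += _int24(round(x * 64))
    body += _int24(round(y * 64))
    body += _int24(round(z * 64))
    body += _int24(zoom) + _int24(focus)
    body += b"\x00\x00"  # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF  # all bytes sum to 0x40 mod 256
    return body + bytes([checksum])

packet = freed_d1_packet(1, pan=10.0, tilt=-5.0, roll=0.0,
                         x=100.0, y=0.0, z=150.0)
print(len(packet))  # -> 29, the fixed D1 packet size
```

Feeding the ZED's pose into this encoder and sending the result over UDP at the tracker rate is typically all a FreeD consumer needs.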
Yes, you can retrieve both types of information simultaneously.
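To illustrate why having both streams matters for the AR goal: with a per-pixel depth map you can decide, pixel by pixel, whether the filmed subject or the virtual object is closer to the camera. A toy sketch in pure Python follows (in practice Unreal does this on the GPU; the names and data layout here are illustrative):

```python
def composite_with_occlusion(video, real_depth, cg, cg_depth):
    """Per-pixel occlusion: show the CG pixel only where the virtual
    object is closer to the camera than the filmed subject.

    All inputs are 2D lists of equal size; depths are metres from the
    camera (smaller = closer).
    """
    out = []
    for vid_row, rd_row, cg_row, cgd_row in zip(video, real_depth, cg, cg_depth):
        row = []
        for v, d, c, dc in zip(vid_row, rd_row, cg_row, cgd_row):
            row.append(c if dc < d else v)
        out.append(row)
    return out

# Subject at 2 m on the left, 4 m on the right; virtual object at 3 m.
video      = [["subj", "subj"]]
real_depth = [[2.0, 4.0]]
cg         = [["cube", "cube"]]
cg_depth   = [[3.0, 3.0]]
print(composite_with_occlusion(video, real_depth, cg, cg_depth))
# -> [['subj', 'cube']]: the subject passes in front of the object
#    on the left and behind it on the right
```

This is exactly the behavior that lets subjects move both in front of and behind 3D objects, and it requires the depth stream alongside the tracking data.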
Thanks for your reply! So regarding my first question, the device works like any other V-SLAM device. Since there is a Live Link plugin for Unreal Engine, I have a few additional questions:
For pure camera tracking, are there options to offset the ZED 2i to match the real video camera and prevent "floating" effects?
I noticed the plugin for UE 5.7 is not yet on GitHub. Is there a confirmed release date for it?