Hello everyone,
I am planning to build a small Virtual Production (VP) setup. I have heard that the ZED 2i can be used as a camera tracker, but I have a few concerns:
- Drift on Green Screens: Does the ZED 2i drift noticeably when used in front of a green screen? I know this was a major issue with the Intel RealSense T265.
- Jitter and Synchronization: Since the ZED 2i lacks genlock, is there a reliable way to compensate for the offset between the video signal and the tracking data? Is there an option to add a delay value to the tracking stream so the two line up? (Roughly what I have in mind is in the first sketch after this list.)
- FreeD Protocol & Software Support: Does the ZED 2i support the FreeD protocol? I am looking to connect it to Assimilate Live FX in addition to Unreal Engine 5. (See the second sketch after this list for the kind of bridge I am imagining.)
- Simultaneous Tracking and Depth: Since the ZED 2i relies on CUDA, can a single sensor provide both camera tracking data and depth information simultaneously? My goal is to have video subjects pass both in front of and behind 3D objects in Unreal Engine (AR/VR workflow). (See the last sketch after this list.)
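On the synchronization point, here is roughly what I have in mind: a small, purely illustrative Python sketch (not tied to any particular SDK) that buffers incoming pose samples and releases them after a configurable delay, so the tracking data can be lined up with a video feed that arrives later. The sample structure and the 120 ms delay value are made up for illustration.

```python
import time
from collections import deque

class DelayedPoseBuffer:
    """Hold tracking samples for a fixed delay so they can be re-emitted
    in step with a video feed that arrives later. Purely illustrative;
    the delay value would be tuned by eye against a sync chart or clapper."""

    def __init__(self, delay_ms: float):
        self.delay_s = delay_ms / 1000.0
        self.buffer = deque()  # (arrival_time, pose) pairs, oldest first

    def push(self, pose) -> None:
        """Store a pose sample together with its arrival time."""
        self.buffer.append((time.monotonic(), pose))

    def pop_ready(self):
        """Return all samples whose delay has elapsed, oldest first."""
        now = time.monotonic()
        ready = []
        while self.buffer and now - self.buffer[0][0] >= self.delay_s:
            ready.append(self.buffer.popleft()[1])
        return ready

# Usage: push poses as the tracker delivers them, pop on the render tick.
buf = DelayedPoseBuffer(delay_ms=120.0)  # example value only
buf.push({"x": 0.0, "y": 1.5, "z": 0.0})
time.sleep(0.13)
for pose in buf.pop_ready():
    print("send to engine:", pose)
```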
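On FreeD: even if the ZED 2i does not speak FreeD natively, I assume a small bridge could pack the pose into FreeD packets and send them over UDP. Below is a sketch of the commonly described 29-byte "D1" message layout as I understand it (angles in degrees scaled by 32768, positions in 1/64 mm, checksum byte at the end); the address and port are placeholders, and I would appreciate a correction if Live FX expects something different.

```python
import socket
import struct

def _int24(value: int) -> bytes:
    """Encode a signed integer as a 3-byte big-endian field
    (assumes the value fits in 24 bits)."""
    return struct.pack(">i", value)[1:]  # drop the high byte of the 32-bit form

def freed_d1_packet(camera_id: int,
                    pan_deg: float, tilt_deg: float, roll_deg: float,
                    x_mm: float, y_mm: float, z_mm: float,
                    zoom: int = 0, focus: int = 0) -> bytes:
    """Build a 29-byte FreeD D1 packet (layout as I understand it)."""
    body = bytes([0xD1, camera_id & 0xFF])
    body += _int24(int(pan_deg * 32768))   # pan, tilt, roll: degrees * 32768
    body += _int24(int(tilt_deg * 32768))
    body += _int24(int(roll_deg * 32768))
    body += _int24(int(x_mm * 64))          # position: millimetres * 64
    body += _int24(int(y_mm * 64))
    body += _int24(int(z_mm * 64))
    body += _int24(zoom)
    body += _int24(focus)
    body += b"\x00\x00"                     # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF    # 0x40 minus all bytes, mod 256
    return body + bytes([checksum])

# Send one packet to a receiver (address/port are placeholders).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = freed_d1_packet(camera_id=1,
                         pan_deg=10.0, tilt_deg=-2.5, roll_deg=0.0,
                         x_mm=1200.0, y_mm=0.0, z_mm=1600.0)
sock.sendto(packet, ("127.0.0.1", 40000))
```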
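And for the last point, this is the kind of loop I am hoping a single camera can run: based on my reading of the ZED SDK Python tutorials, positional tracking and depth retrieval both hang off the same grab() call, but I would like confirmation that this holds at VP-friendly frame rates. The resolution, depth mode, and frame count below are guesses on my part.

```python
import pyzed.sl as sl

# Open the camera with depth enabled (parameter choices are guesses on my part).
zed = sl.Camera()
init = sl.InitParameters()
init.camera_resolution = sl.RESOLUTION.HD1080
init.camera_fps = 30
init.depth_mode = sl.DEPTH_MODE.ULTRA
init.coordinate_units = sl.UNIT.METER
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED 2i")

# Enable positional tracking on the same device.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()
depth = sl.Mat()

for _ in range(300):  # roughly 10 seconds at 30 fps
    if zed.grab(runtime) != sl.ERROR_CODE.SUCCESS:
        continue
    # Camera pose in world coordinates (for driving the virtual camera).
    zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
    tx, ty, tz = pose.get_translation(sl.Translation()).get()
    # Per-pixel depth map (for occluding subjects against CG objects).
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
    print(f"pose: ({tx:.2f}, {ty:.2f}, {tz:.2f}) m, "
          f"depth map: {depth.get_width()}x{depth.get_height()}")

zed.disable_positional_tracking()
zed.close()
```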