ZED 2i body tracking + Oculus Quest 3 (UE5)

The Body Tracking example scene for Unreal Engine 5 works as expected, but when I try to inject VR functionality into the ZedPawn, it simply doesn’t work and the application becomes unresponsive.
Is there an example showing how the ZED 2i can interact with VR headsets? Can VR code be added to the ZedPawn so the player can see, from within the game, the 3D avatar replicating their movements?

To clarify, this is what I’m trying to do:

  • Multiplayer application in Unreal Engine 5 with 2+ PCs connected via LAN.
  • Player 1 hosts the session and launches the body tracking example. This means there is a small window displaying what the ZED 2i camera is capturing, as well as a 3D avatar (e.g. Manny) mimicking Player 1’s movements. In addition, Player 1 wears a VR headset (Oculus Quest 3) so they have an in-game view of the 3D avatar’s movement and the virtual scene.
  • Player 2 joins the session (no VR headset) and sees Player 1’s 3D avatar and the movements from the body tracking capture.

I’d appreciate any tips or examples on how to get this working; I haven’t had any luck so far.


I’m sorry, but our UE5 plugin is not compatible with VR headsets. You would need to modify the plugin directly to add this feature.

However, I’m not sure this is actually a problem in your case. Do you need to display the ZED images inside the headset?

If not, I don’t think the ZedPawn needs to be VR-compatible at all. The ZedPawn (which “simulates” the real ZED camera in the scene) and the headset’s camera are independent.

In your scene, you should have both the ZedPawn, which runs the ZED SDK and computes the skeleton tracking, and a separate “VR pawn” that serves as Player 1’s point of view.
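As an illustration of that separation, a minimal VR pawn could look like the sketch below. To be clear, this is an assumption-heavy sketch and not part of the ZED plugin: the class name `AVRSpectatorPawn` and the overall setup are made up for this example, and it has not been tested against the plugin. It relies only on standard UE5 behavior: with an XR plugin (e.g. OpenXR) enabled, a `UCameraComponent` tracks the HMD by default (`bLockToHmd` is true), so this pawn can act as Player 1’s headset view while the unmodified ZedPawn keeps running the ZED SDK and body tracking.

```cpp
// VRSpectatorPawn.h -- hypothetical minimal VR pawn, separate from the ZedPawn.
// Assumes the OpenXR (or another XR) plugin is enabled in the project.

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "VRSpectatorPawn.generated.h"

UCLASS()
class AVRSpectatorPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRSpectatorPawn()
    {
        PrimaryActorTick.bCanEverTick = false;

        // Plain scene root so the pawn can be placed freely in the level.
        SetRootComponent(CreateDefaultSubobject<USceneComponent>(TEXT("Root")));

        // bLockToHmd is true by default on UCameraComponent, so this camera
        // follows the headset's tracked pose when an HMD is connected.
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(RootComponent);

        // Let the first local player (Player 1, who wears the headset)
        // possess this pawn automatically.
        AutoPossessPlayer = EAutoReceiveInput::Player0;
    }

private:
    UPROPERTY(VisibleAnywhere)
    UCameraComponent* Camera;
};
```

With something like this placed in the level alongside the untouched ZedPawn, Player 1 views the scene (including the tracked 3D avatar) through the VR pawn, while the ZedPawn continues to drive the avatar from the camera data.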

Let me know if I missed something.

Best regards,

R&D Engineer
Stereolabs Support