Hi,
The Body Tracking example scene for Unreal Engine 5 works as expected, but when I try to add VR functionality to the ZedPawn, the application hangs and becomes unresponsive.
Is there an example showing how the ZED 2i can interact with VR headsets? Can VR code be added to the ZedPawn so the player can see the 3D avatar replicating their movements from within the game?
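For context, the change I attempted was roughly along these lines (a minimal sketch only; the class name AMyVRZedPawn is mine, and I derive from APawn here just so the snippet stands alone, whereas in my project I was modifying the plugin’s ZedPawn directly):

```cpp
// Minimal VR pawn setup (sketch): a scene root plus an HMD-locked camera.
// In my project I tried adding the same components to the plugin's ZedPawn.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "MyVRZedPawn.generated.h"

UCLASS()
class AMyVRZedPawn : public APawn
{
    GENERATED_BODY()

public:
    AMyVRZedPawn()
    {
        // Root component that the headset-tracked camera hangs off
        VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
        SetRootComponent(VROrigin);

        // bLockToHmd makes the camera follow the headset pose (Quest 3 via OpenXR)
        VRCamera = CreateDefaultSubobject<UCameraComponent>(TEXT("VRCamera"));
        VRCamera->SetupAttachment(VROrigin);
        VRCamera->bLockToHmd = true;
    }

    UPROPERTY(VisibleAnywhere)
    USceneComponent* VROrigin;

    UPROPERTY(VisibleAnywhere)
    UCameraComponent* VRCamera;
};
```

Adding that kind of camera to the ZedPawn is what leads to the hang, so I assume it conflicts with how the ZedPawn drives its own camera, but I couldn’t find documentation on the intended approach.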
To clarify, this is what I’m trying to do:
- Multiplayer application in Unreal Engine 5 with 2+ PCs connected via LAN.
- Player 1 hosts the session and launches the body tracking example. This means there is a small window displaying what the ZED 2i camera is capturing, as well as a 3D avatar (e.g. Manny) mimicking Player 1’s movements. In addition, Player 1 wears a VR headset (Oculus Quest 3) so they have an in-game view of the 3D avatar’s movements and the virtual scene.
- Player 2 joins the session (no VR headset) and sees Player 1’s 3D avatar mirroring the movements captured by the body tracking (my rough idea of how this could replicate is sketched below).
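The replication part is the piece I’m least sure about. My rough plan, purely as a sketch of the idea (AReplicatedAvatar and BoneRotations are placeholder names of mine, not anything from the ZED plugin), is for the host to push the body-tracking pose into replicated properties on the avatar actor so Player 2’s client can apply it to its local copy of the skeletal mesh:

```cpp
// Sketch of the idea only: the host (Player 1) writes the tracked pose into
// replicated properties each tick; clients read them and drive their local avatar.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "ReplicatedAvatar.generated.h"

UCLASS()
class AReplicatedAvatar : public AActor
{
    GENERATED_BODY()

public:
    AReplicatedAvatar()
    {
        bReplicates = true;
        SetReplicateMovement(true); // root transform handled by standard movement replication
    }

    // Bone rotations filled from the ZED body-tracking data on the host
    UPROPERTY(Replicated)
    TArray<FRotator> BoneRotations;

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AReplicatedAvatar, BoneRotations);
    }
};
```

Whether that is a sensible approach with the ZED plugin, or whether there is a built-in way to replicate the body-tracking data, is exactly what I’m unsure about.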
I’d appreciate any tips or examples on how to get this working; I haven’t had any luck so far.
Thanks!