I am working on a project for some virtual production applications. I have a user in a digital world viewing it through a Meta Quest 3. That is working fine.
I have a ZED 2i camera showing a view of the digital world from its perspective. I can synchronize the two worlds by using the Quest 3 to scan a QR code placed behind the ZED camera. This recenters the world so the Quest knows where the ZED camera sits in both digital and physical space, and it aligns everything well enough for my use case.
I have two pieces I'm struggling with:
- I need the Quest 3 to render its own view of the digital world, synchronized to the ZED camera's view, so the player can move through it. The headset side works: I set the Quest 3's Center Eye Camera to Display 1 with Target Eye set to Both. I also need the ZED camera to display what it sees with AR passthrough, so the user is superimposed onto the digital world, but I cannot figure out how to route the headset view and the ZED camera view to different displays in a PCVR build. Has anyone succeeded with this use case?
- The AR view from the ZED camera successfully shows the user and some of the digital world, BUT it renders the passthrough image over elements of the digital world that should be occluding the user. I'm not sure what is causing this or where to look to solve it.
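For what it's worth, the multi-display routing I'm attempting would look roughly like this in script form (I currently set the camera target in the inspector; `MultiDisplaySetup` and the `zedArCamera` reference are just illustrative names, and as I understand it `Display.Activate()` only takes effect in a standalone player, not in the editor):

```csharp
using UnityEngine;

// Sketch: activate a second physical display and route the ZED AR
// camera to it, leaving the XR headset view on Display 1.
public class MultiDisplaySetup : MonoBehaviour
{
    // Assumed reference to the camera rendering the ZED passthrough view.
    public Camera zedArCamera;

    void Start()
    {
        // Display.displays[0] (Display 1) is always active; additional
        // displays must be activated explicitly in a standalone build.
        if (Display.displays.Length > 1)
        {
            Display.displays[1].Activate();
        }

        // targetDisplay is zero-based, so 1 corresponds to "Display 2"
        // in the inspector dropdown.
        if (zedArCamera != null)
        {
            zedArCamera.targetDisplay = 1;
        }
    }
}
```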
Open to any suggestions, and happy to clarify any of this.