Trying to get mixed reality working with my old ZED Mini in Unity, using only the HMD's tracking - no tracking from the ZED itself. The ZED is mounted on the front of my HMD.
In a brand-new Unity scene using the built-in render pipeline, I removed the default camera, added the XR plugin and the Oculus plugin, dropped in the ZED stereo rig, disabled the tracking option in the ZED Manager, and added a sphere in front of the camera.
When I run the project in the editor, the camera is detected and I can see the stereo scene in my HMD. If I reset the HMD tracking and look straight ahead, I get a perfect AR scene. BUT the rendered quads in the HMD view seem to have the HMD's rotation applied to them - they don't stay centered in the headset view. If I tilt my head up, the quads rotate up with my head toward the top of the view, leaving more black area at the bottom of the HMD view and less and less at the top. The same happens for down, left, and right rotations. I loaded the Planetarium example, and it does the exact same thing.
Anyone have an idea of what I’m doing wrong?