I am building a see-through VR project.
I need to show a different texture depending on which eye the image is rendered for.
The Unity way of solving this seems to be using EyeIndex in the shader; however, this doesn't work, and the eye index is 0 in both the left and the right eye.
Hello Benjamin,
thank you for your answer. "EyeIndex" is already a prebuilt node in Unity that accesses the StereoEyeIndex AFAIK; however, I also already tried making a custom function that outputs unity_StereoEyeIndex instead, to no avail. (I also tried using Single Pass and Multi Pass.)
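For reference, the custom function I tried looked roughly like this (a sketch of a "File"-mode Custom Function node body; the name GetEyeIndex_float is arbitrary):

```hlsl
// Custom Function node ("File" mode); one output port: EyeIndex (Float).
// The _float suffix follows Shader Graph's precision naming convention.
void GetEyeIndex_float(out float EyeIndex)
{
#if defined(UNITY_STEREO_INSTANCING_ENABLED) || defined(UNITY_STEREO_MULTIVIEW_ENABLED)
    EyeIndex = (float)unity_StereoEyeIndex;
#else
    // unity_StereoEyeIndex is only meaningful while rendering in stereo
    EyeIndex = 0.0;
#endif
}
```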
If you can see the ZED images in your headset with no problem, it can't be an installation issue.
We already use this variable to know which eye we need to render. The only difference from your setup is that we do it in a shader, not in Shader Graph.
Do you think the issue comes from our plugin? Did you try not using our plugin at all and displaying a different texture for each eye with a simple Unity camera?
Without the add-on, in URP with just the basic camera, one plane, and this very basic shader, it works flawlessly (OpenXR is active, if that makes a difference). I use the 2022 LTS version of Unity.
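The "very basic shader" looked roughly like this (a minimal URP unlit sketch that picks one of two textures per eye; the property names _LeftTex/_RightTex are just placeholders):

```hlsl
Shader "Unlit/PerEyeTexture"
{
    Properties
    {
        _LeftTex  ("Left Eye Texture",  2D) = "white" {}
        _RightTex ("Right Eye Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }
        Pass
        {
            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma multi_compile_instancing
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

            TEXTURE2D(_LeftTex);  SAMPLER(sampler_LeftTex);
            TEXTURE2D(_RightTex); SAMPLER(sampler_RightTex);

            struct Attributes
            {
                float4 positionOS : POSITION;
                float2 uv         : TEXCOORD0;
                UNITY_VERTEX_INPUT_INSTANCE_ID
            };

            struct Varyings
            {
                float4 positionHCS : SV_POSITION;
                float2 uv          : TEXCOORD0;
                UNITY_VERTEX_OUTPUT_STEREO
            };

            Varyings vert(Attributes IN)
            {
                Varyings OUT;
                UNITY_SETUP_INSTANCE_ID(IN);
                UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(OUT);
                OUT.positionHCS = TransformObjectToHClip(IN.positionOS.xyz);
                OUT.uv = IN.uv;
                return OUT;
            }

            half4 frag(Varyings IN) : SV_Target
            {
                // Resolve unity_StereoEyeIndex when Single Pass Instanced is used
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
                if (unity_StereoEyeIndex == 0) // 0 = left eye, 1 = right eye
                    return SAMPLE_TEXTURE2D(_LeftTex, sampler_LeftTex, IN.uv);
                return SAMPLE_TEXTURE2D(_RightTex, sampler_RightTex, IN.uv);
            }
            ENDHLSL
        }
    }
}
```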
Then I tried opening the Planetarium scene (again, with only OpenXR active, plus the URP conversion) to figure out whether my setup is wrong (presumably the example scenes should be correct).
Two things stand out:
The materials for the planets except the Sun don't load, but that is likely not relevant.
The selector that lets me choose left/right/both eyes doesn't do anything in any of the ZED scenes, although it does work in a normal scene with just one camera (when using a VR headset).
Again, after adding a plane with the basic shader, it doesn't work (i.e., the eye index is 0).
In VR, there is only a floating image plane/screen showing what the camera sees, instead of the view "surrounding" me. But this is probably how it is supposed to work?
I am guessing I am doing something wrong or missing a step to enable true VR? That is why I tried the example scene, but even that doesn't seem to work. Sadly, the guide on setting up Mixed Reality with the Vive that is linked in the "AR Video Pass-through with Unity" guide doesn't exist anymore (it links straight to the homepage).
I’ll reproduce your setup on my side and see if I have the same issue.
It's not really possible to completely fill the screen because the FoV of the ZED camera and the headset differ (the Quest's is bigger).
One workaround is to either increase the size of the plane where the image is displayed or decrease the FoV of the headset (explained here: https://github.com/stereolabs/zed-unity/issues/157).
However, this will distort the image, so we do not recommend it.
Well, my workaround is going to be simply hiding the plane from one of the cameras and having a different one for the other. It would still be nice to know whether there is a proper way to achieve this with just the eye index. Did it work for you?
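In shader code, the same effect could presumably also be achieved per eye rather than per camera, by discarding the plane's pixels for one eye (a sketch assuming a URP shader that already sets up the stereo instancing macros, with a placeholder _MainTex):

```hlsl
// Fragment shader sketch: hide this plane in the right eye only.
half4 frag(Varyings IN) : SV_Target
{
    // Resolve unity_StereoEyeIndex for Single Pass Instanced rendering
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
    if (unity_StereoEyeIndex == 1) // 0 = left eye, 1 = right eye
        discard;
    return SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, IN.uv);
}
```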
Sorry for the very late reply, I was out of office for the last weeks.
With the current implementation, you cannot have two planes without making significant modifications to the plugin's code.
I tested it using a simulated headset because I don't have one right now, and it seems to work as expected. However, the stereo rendering mode needs to be set to "Single Pass Instanced", not "Multi Pass".