LennartJohansen opened 8 months ago
Looks like passthrough is not available when using a CompositorLayer and Metal shaders.
You have to use RealityView to get a proper mixed mode with passthrough.
I guess the color calculations done in the fragment shader could be written out to a texture in memory. That texture could then be shown on a plane rendered with RealityView using an unlit material, and in that mode we could discard pixels (rough sketch below).
Could this also help with the 1.5-meter area limit for fully immersive applications? With an unlit material, the RealityView output would not be affected by room lighting. Is the 1.5-meter limit enforced in mixed applications like this?
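Something along these lines is what I have in mind. Just a sketch: `makePassthroughPlane` and the plane size are made up, and I haven't verified on device that UnlitMaterial picks up the base color texture's alpha exactly this way.

```swift
import CoreGraphics
import Metal
import RealityKit

// Sketch: a plane entity whose unlit material shows a Metal-rendered texture.
// Pixels the shader writes with alpha 0 would reveal passthrough behind the plane.
func makePassthroughPlane(width: Int, height: Int) throws -> (ModelEntity, TextureResource.DrawableQueue) {
    // Drawable queue the Metal render pass writes into each frame.
    let queue = try TextureResource.DrawableQueue(.init(
        pixelFormat: .bgra8Unorm,
        width: width,
        height: height,
        usage: [.renderTarget, .shaderRead],
        mipmapsMode: .none))

    // Blank placeholder texture; its contents get replaced by the drawable queue.
    let context = CGContext(
        data: nil, width: width, height: height, bitsPerComponent: 8, bytesPerRow: 0,
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    let placeholder = try TextureResource.generate(
        from: context.makeImage()!, options: .init(semantic: .color))
    placeholder.replace(withDrawables: queue)

    // Unlit, so room lighting doesn't affect the output; transparent blending so the
    // texture's alpha can cut holes (unverified; it may need the alpha supplied
    // through the blending's opacity texture instead).
    var material = UnlitMaterial()
    material.color = .init(texture: .init(placeholder))
    material.blending = .transparent(opacity: .init(scale: 1.0))

    let plane = ModelEntity(
        mesh: .generatePlane(width: 1.6, height: 0.9),  // arbitrary size in meters
        materials: [material])
    return (plane, queue)
}
```

Each frame the Metal side would grab `queue.nextDrawable()`, render into its texture, and call `present()` on it; the compositor should then show passthrough wherever the plane's alpha drops below 1.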
RealityView isn't really viable unless we can get precise vsync timing, is the thing. We have to know the exact head pose at every vsync, or we won't be able to timewarp correctly (the world will look sloshy).
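For context, this is roughly the kind of frame loop a CompositorLayer gives us (a simplified sketch, not the actual client code; `submitFrame` and its parameters are placeholders). The key part is waiting for the optimal input time and stamping the drawable with the head pose predicted for its presentation time, which RealityView doesn't expose.

```swift
import ARKit
import CompositorServices
import Metal

// Sketch of a LayerRenderer frame loop: wait for the frame's optimal input time,
// then attach the head pose predicted for the drawable's presentation time so the
// compositor can timewarp correctly.
func submitFrame(_ layerRenderer: LayerRenderer,
                 _ worldTracking: WorldTrackingProvider,
                 _ commandBuffer: MTLCommandBuffer) {
    guard let frame = layerRenderer.queryNextFrame() else { return }

    frame.startUpdate()
    // ... per-frame simulation / stream decoding ...
    frame.endUpdate()

    guard let timing = frame.predictTiming() else { return }
    LayerRenderer.Clock().wait(until: timing.optimalInputTime)

    frame.startSubmission()
    guard let drawable = frame.queryDrawable() else { return }

    // Head pose for the exact presentation time of this drawable.
    let duration = LayerRenderer.Clock.Instant.epoch
        .duration(to: drawable.frameTiming.presentationTime)
    let presentationSeconds = Double(duration.components.seconds)
        + Double(duration.components.attoseconds) * 1e-18
    drawable.deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: presentationSeconds)

    // ... encode the Metal render pass into drawable.colorTextures / depthTextures ...

    drawable.encodePresent(commandBuffer: commandBuffer)
    commandBuffer.commit()
    frame.endSubmission()
}
```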
Since some time has passed, I wanted to ask again whether anything has changed. Mixed reality with Flightsimulator would be fantastic.
It's fully implemented client-side; however, on visionOS 1.x you'll have to use the Experimental Renderer. Also, unfortunately, Apple broke backwards compatibility with the Xcode 16 beta (needed for visionOS 2 features), so the TestFlight builds are a mess at the moment.
Ah, okay, thanks for the reply!
Would it be hard to add a chroma key passthrough to the client application?
I tried to change the immersion mode of the application from full to mixed. Setting alpha in the fragment shader had no effect, and discarding pixels only makes the output black.
Does anyone know if there are additional settings needed to get passthrough working? Something on the compositor layer, the depth setup, or anything else?
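For reference, this is roughly how I understand the setup has to look (just a sketch; `startRenderLoop` is a placeholder for the existing Metal render loop, and I'm assuming the render target needs an alpha channel for mixed mode to have anything to blend against):

```swift
import CompositorServices
import SwiftUI

// Assumed requirements for passthrough: a .mixed immersion style on the space and
// an alpha-capable color format on the layer. Layout/foveation follow the usual pattern.
struct PassthroughConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                           configuration: inout LayerRenderer.Configuration) {
        configuration.colorFormat = .rgba16Float   // alpha-capable color format
        configuration.depthFormat = .depth32Float

        configuration.isFoveationEnabled = capabilities.supportsFoveation
        let options: LayerRenderer.Capabilities.SupportedLayoutsOptions =
            capabilities.supportsFoveation ? [.foveationEnabled] : []
        configuration.layout = capabilities.supportedLayouts(options: options)
            .contains(.layered) ? .layered : .dedicated
    }
}

// Placeholder for the existing Metal render loop.
func startRenderLoop(_ layerRenderer: LayerRenderer) { /* ... */ }

@main
struct ClientApp: App {
    // .mixed is what actually enables passthrough around the rendered content.
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        ImmersiveSpace(id: "Renderer") {
            CompositorLayer(configuration: PassthroughConfiguration()) { layerRenderer in
                startRenderLoop(layerRenderer)
            }
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}
```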
It would be nice to have an option to set a key color and a tolerance that the fragment shader could use to make matching pixels transparent.
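Something like this is the per-pixel test I mean, written here in Swift with simd just so the math is easy to read; the fragment shader version would be a direct translation. `chromaKeyAlpha`, `tolerance`, and `softness` are made-up names, and the values would come from new client settings.

```swift
import simd

// Sketch: alpha for one pixel given a key color and a tolerance radius in RGB space.
func chromaKeyAlpha(pixel: SIMD3<Float>,
                    keyColor: SIMD3<Float>,
                    tolerance: Float,
                    softness: Float = 0.05) -> Float {
    // How far the pixel is from the key color.
    let distance = simd_distance(pixel, keyColor)
    // 0 (transparent) inside the tolerance radius, ramping to 1 (opaque) over
    // `softness` so the keyed edge isn't a hard cutoff.
    let t = min(max((distance - tolerance) / softness, 0), 1)
    return t * t * (3 - 2 * t)   // smoothstep
}

// Example: a greenish pixel close to a pure-green key color gets alpha 0.
let alpha = chromaKeyAlpha(pixel: SIMD3(0.10, 0.85, 0.12),
                           keyColor: SIMD3(0.0, 1.0, 0.0),
                           tolerance: 0.3)
```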