Closed: atteneder closed this issue 6 months ago
Manually convert to URP at runtime or editor script?
> Manually convert to URP at runtime or editor script?
I don't get what you're asking. Can you elaborate?
Update regarding shaders: The glTF shader graphs have a couple of incompatible elements in them. One solution could be to create a separate, visionOS compatible set of shader graphs.
VisionOS seems to support URP and Unlit shaders. So I wonder about converting?
> VisionOS seems to support URP and Unlit shaders. So I wonder about converting?
I don't understand that question. Can you be more specific? Convert what to what?
Hi Andreas,
Thanks for looking into this. We are really interested in this capability/platform support.
A couple of months ago we also tried this, and I was able to get it partially working in the simulator: https://discussions.unity.com/t/gltfast-shaders-in-polyspatial/280868/4 What I did at the time was fully rebuild the shader graph using the same version of the URP package as the sample. There seemed to be an incompatibility between some of the graph nodes and your specific package (and there was no real "URP shader graph upgrade" for some of the nodes). That resolved the black-material issue, but the rendering was still sub-par (materials not matching, transparency/depth issues, etc.).
FYI: It may seem counterintuitive, but for us the most important obstacle is getting the Draco package supported on visionOS. It is difficult to turn Draco off in our production and test pipelines, and official support would save us from duplicating effort rebuilding Draco from source with potentially different build settings than the ones you are using.
I made a Visual Scripting node library for glTFast two months ago, and I also noticed that some features didn't work on visionOS, as mentioned in this issue.
I found that the shaders in UnityGLTF work on visionOS, so I made a material generator that applies those shaders to imported models at runtime.
With a few modifications, glTFast worked on visionOS. The rendering pipeline is URP.
Changed the textures to always be readable by the CPU.
Modified Emission so it no longer uses glTFast_HDRP_GetEmissionHDRColor and RPSwitch.
Changed BaseColor so it no longer uses ProjectColorSpace.
Removed Emission from the Fragment node in glTF-unlit.
You can try it by adding the following line to the manifest.json and importing it:
"com.atteneder.gltfast": "https://github.com/HoloLabInc/glTFast.git#polyspatial-urp",
KTX's first step towards visionOS is coming along well.
n.b. Unity's glTFast works out of the box - just add it by name in the Package Manager
https://docs.unity3d.com/Packages/com.unity.cloud.gltfast@5.2/manual/index.html
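If you prefer editing the manifest directly instead of the Package Manager UI, adding the package by name would look roughly like this (the version number is an example, matching the 5.2 docs linked above):

```json
{
  "dependencies": {
    "com.unity.cloud.gltfast": "5.2.0"
  }
}
```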
Actually, is vertex color supported for runtime glb import?
> With a few modifications, glTFast worked on visionOS. The rendering pipeline is URP.
You're my savior @tarukosu
@atteneder Does glTFast work as-is for fully immersive/VR apps? Or do these issues need to be fixed across the board (windowed, immersive/MR, and fully immersive/VR) for full functionality?
> @atteneder Does glTFast work as-is for fully immersive/VR apps? Or do these issues need to be fixed across the board (windowed, immersive/MR, and fully immersive/VR) for full functionality?
Can confirm that glTFast works properly for Fully Immersive Apps.
> Actually, is vertex color supported for runtime glb import?
It should be, with glTFast's shader graphs.
> @atteneder Does glTFast work as-is for fully immersive/VR apps? Or do these issues need to be fixed across the board (windowed, immersive/MR, and fully immersive/VR) for full functionality?
To quote the PolySpatial visionOS docs:
> For technical, security, and privacy reasons, visionOS does not allow Metal-based shaders or other low-level shading languages to run when using AR passthrough.
That leads me to assume they already work in windowed and fully immersive mode.
I'm currently working on getting the mixed reality cases (conversion to MaterialX) working as well as possible, too.
hth
Unity glTFast 6.4.0 was released with initial visionOS support!
OpenUPM release will follow later today.
The newest versions of KTX for Unity and Draco for Unity support visionOS as well.
Emission on materials is not working, that's a known issue on PolySpatial's side that'll be fixed soon. As a workaround you can change the mode on the emissiveFactor color to default (instead of HDR).
Thanks for your feedback!
Thanks @atteneder!
I just tried out 6.4.0 and I'm still getting black textures in the editor's simulation and on device. Using two of the sample assets from here https://github.com/KhronosGroup/glTF-Sample-Assets/blob/main/Models/, trying one imported vs one downloaded at runtime, this is what I'm seeing:
With warnings in the console: [Diagnostics] Warning: Non shader graph shader 'glTF/PbrMetallicRoughness' not supported or MaterialX encoding missing
If I create a new material, set its shader to Shader Graphs/glTF-pbrMetallicRoughness, and then use that on the imported model, it does work, both in the editor's simulation and on device. I don't think this was working before, so that's good! But that approach isn't feasible for runtime-loaded assets.
Is there a step I'm missing?
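For anyone wanting to try that same workaround from code instead of a pre-made material asset, here's a rough, untested sketch (the shader name follows the comment above; Shader.Find only works if the shader graph is actually included in the build, e.g. via Always Included Shaders):

```csharp
using UnityEngine;

// Workaround sketch: swap a runtime-loaded model's materials over to
// glTFast's pbrMetallicRoughness shader graph.
public class ShaderGraphWorkaround : MonoBehaviour
{
    void Start()
    {
        // Shader.Find returns null if the shader was stripped from the build.
        var shader = Shader.Find("Shader Graphs/glTF-pbrMetallicRoughness");
        if (shader == null)
        {
            Debug.LogWarning("glTF shader graph not found; is it included in the build?");
            return;
        }

        foreach (var renderer in GetComponentsInChildren<Renderer>())
        {
            foreach (var material in renderer.materials)
            {
                // Only the shader is swapped; whether all imported texture and
                // color properties carry over cleanly is not guaranteed.
                material.shader = shader;
            }
        }
    }
}
```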
@lourd I think you're using the built-in render pipeline and I've only tested the Universal Render Pipeline.
Could you try whether setting the scripting define GLTFAST_BUILTIN_SHADER_GRAPH in your Player Settings works?
This changes glTFast to use the shader graphs with Built-in as well and might solve your issue without the need to switch to URP completely.
Otherwise just use URP. The actual rendering on the device (or in the Simulator) will be done by visionOS regardless.
hth
btw: here's the documentation regarding shader graphs in built-in.
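If you'd rather set that define from an editor script than through the Player Settings UI, something along these lines should work (a sketch assuming a Unity version with visionOS build support, where BuildTargetGroup.VisionOS is available):

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor-only sketch: append GLTFAST_BUILTIN_SHADER_GRAPH to the
// scripting define symbols of the visionOS build target group.
public static class GltfastDefines
{
    [MenuItem("Tools/Enable glTFast Built-in Shader Graphs")]
    public static void EnableBuiltinShaderGraphs()
    {
        const string define = "GLTFAST_BUILTIN_SHADER_GRAPH";
        var group = BuildTargetGroup.VisionOS;
        var defines = PlayerSettings.GetScriptingDefineSymbolsForGroup(group);
        if (!defines.Contains(define))
        {
            PlayerSettings.SetScriptingDefineSymbolsForGroup(
                group, string.IsNullOrEmpty(defines) ? define : defines + ";" + define);
        }
    }
}
#endif
```

This is equivalent to typing the symbol into Project Settings > Player > Scripting Define Symbols for the visionOS target.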
> I think you're using the built-in render pipeline
Thought I was using URP, just realized I created the Render Pipeline Asset but never assigned it in settings 🤦
Setting the GLTFAST_BUILTIN_SHADER_GRAPH scripting define also fixed the issues for the Built-in Render Pipeline; I tested that out separately.
Thanks @atteneder!
Alpha clipping is not working on visionOS. Is it just me?
Actually, it works in the Editor and in Play Mode.
Additionally, it sometimes works in the simulator when I change the shader graph's (pbrMetallicRoughness) alpha clipping to true, but that change is not permanent.
Even if I set alpha clipping to true on the material (using a copied preset), it does not work.
Unfortunately, the current Unity PolySpatial does not support Material Override. Therefore, even if Alpha Clipping or Transparent is set in the material, the default values in Shader Graph will be used.
As a workaround, you can duplicate the Shader Graph and create versions with Alpha Clipping enabled or Transparent enabled, and use the appropriate shaders.
This workaround is implemented in the following branch.
"com.atteneder.gltfast": "https://github.com/HoloLabInc/glTFast.git#polyspatial-urp-transparent",
> Alpha clipping is not working on visionOS. Is it just me?
> Actually, it works in the Editor and in Play Mode.
> Additionally, it sometimes works in the simulator when I change the shader graph's (pbrMetallicRoughness) alpha clipping to true, but that change is not permanent.
> Even if I set alpha clipping to true on the material (using a copied preset), it does not work.
I can confirm that, unfortunately.
glTFast changes a material's surface type from opaque to transparent at runtime, which does not work with PolySpatial, since it converts shader graphs to MaterialX upfront.
The solution @tarukosu proposed seems like the only valid option for now. Ironically, glTFast used to work with separate shader graphs per surface type for older URP versions (and still does; see the legacy shader graphs). Combining everything into one shader graph made a lot of sense from a maintenance standpoint, though.
I don't have the capacity to offer a short-term solution, but this definitely needs to be addressed somehow.
Thanks
edit: Created #695 to track it.
> Unity glTFast 6.4.0 was released with initial visionOS support!
> OpenUPM release will follow later today.
> The newest versions of KTX for Unity and Draco for Unity support visionOS as well.
> Emission on materials is not working, that's a known issue on PolySpatial's side that'll be fixed soon. As a workaround you can change the mode on the emissiveFactor color to default (instead of HDR).
> Thanks for your feedback!
Any news on how emission is progressing on PolySpatial's side, and can we expect updates from that side in the near future?
> Any news on how emission is progressing on PolySpatial's side, and can we expect updates from that side in the near future?
To quote the PolySpatial 1.2.0 changelog:
> Fixed failure to transfer HDR color properties on shader graph materials on simulator/device.
I have not confirmed it yet though.
Ok, so I checked whether there is a difference between using HDR and switching the emissiveFactor to the default color mode. They look more or less the same. The reason I thought emission was not implemented yet was that my models rendered incredibly dark compared to a non-PolySpatial build on other platforms or in Fully Immersive mode.
My models use an emissive texture - but if that should work, I might want to check again whether it's more of a general lighting problem that I'm encountering.
> Ok, so I checked whether there is a difference between using HDR and switching the emissiveFactor to the default color mode. They look more or less the same.
That's actually good. Previously it would render black (i.e. no emission), so it seems to be fixed.
> The reason I thought emission was not implemented yet was that my models rendered incredibly dark compared to a non-PolySpatial build on other platforms or in Fully Immersive mode.
On my end the simulator has this odd bug where it is very dark, but switching the environment once suddenly makes things correct again. Anyway, I can offer to look into it. Feel free to create a dedicated issue about that.
> My models use an emissive texture - but if that should work, I might want to check again whether it's more of a general lighting problem that I'm encountering.
👍
> On my end the simulator has this odd bug where it is very dark, but switching the environment once suddenly makes things correct again.
Ohh, I can confirm that! Good to know. We first noticed it on the actual device though - I'll recheck with my colleague at the start of next week to see what it looks like on the device.
Just got the chance to recheck the lighting on the device. I can now confirm that emission and lighting work properly. 👍
My issue was indeed either the simulator bug that gets fixed by switching environments or, on the device, dark 3D models caused by the poorly lit room the device was used in. When the device is used in a properly lit environment, the models look just fine.
glTFast ought to work on visionOS, limited only by the platform's capabilities.
A quick test showed both promising results and some remaining issues: