Closed ClayQAQ closed 8 months ago
Here is the official screenshot: https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0/TransmissionTest#screenshot
This feature is fully supported by GLTFKit2, but as noted in the README, not all rendering engines support all glTF features. In this case, transmission is not supported by SceneKit, and is unlikely to ever be supported. However, we have recently improved our handling of base color factors, so I encourage you to update to the latest commit (59dcef1), as it greatly improves the SceneKit rendering of this asset.
These two spheres aren't transparent here. If we set alpha transparency, the model simply disappears. Is that why SceneKit doesn't use alpha directly? And how do other rendering engines implement the transmission feature?
Those two spheres aren't transparent precisely because of the previously established fact that SceneKit doesn't implement transmission. There's nowhere to put the transmission parameters, and trying to hack them in on top of existing support for alpha blending won't produce a satisfactory result.
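To illustrate why alpha blending is a poor stand-in for transmission: alpha fades the entire surface contribution, specular highlights included, while the KHR_materials_transmission model replaces only the diffuse term with transmitted background light tinted by base color, leaving reflections at full strength. A minimal numeric sketch (simplified shading, illustrative function names, not GLTFKit2 or SceneKit API):

```python
def alpha_blend(background, surface_color, alpha):
    # Classic "over" compositing: the WHOLE surface contribution,
    # including specular highlights, is faded by alpha.
    return [b * (1 - alpha) + s * alpha
            for b, s in zip(background, surface_color)]

def transmission_mix(background, diffuse, specular, base_color, transmission):
    # Simplified per the KHR_materials_transmission model: only the
    # diffuse term is traded for transmitted light tinted by base color;
    # the specular reflection stays at full strength.
    return [s + d * (1 - transmission) + b * c * transmission
            for b, d, s, c in zip(background, diffuse, specular, base_color)]

bg = [0.9, 0.9, 0.9]              # bright background behind the sphere
diffuse, specular = [0.2] * 3, [0.4] * 3
base_color = [1.0, 0.2, 0.2]      # reddish glass tint

faded = alpha_blend(bg, [d + s for d, s in zip(diffuse, specular)], 0.1)
glassy = transmission_mix(bg, diffuse, specular, base_color, 0.9)
```

Note how in `faded` the highlight has nearly vanished along with everything else, while in `glassy` the background shows through tinted red but the specular term survives intact, which is why hacking transmission onto alpha blending looks wrong.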
This library implements glTF import. It does not, and never has, purported to render every feature in a plausible or pleasing way. If you want good-looking transmission, I encourage you to look into what that requires. It's not easy, and it's not cheap, and that's why it is not and probably never will be supported by SceneKit.
Thank you. I see. Does RealityKit support the transmission feature?
Unfortunately I don't think any 3D renderer included in the system supports transmission or volume rendering. It's been discussed on Twitter and the Apple Developer forums, but I haven't seen any progress on Apple's part in implementing it.
No alternative? That makes iOS less expressive with transparent materials, doesn't it?
Can native USD rendering implement the transmission feature?
It’s a good question. I’m not acquainted with the capabilities of the Metal implementation of the Hydra renderer, but I don’t think SceneKit or RealityKit can be expected to accurately render glossy transmission now or in the near future.
Are glossy transmission and transmission the same thing? I'm curious why iOS doesn't render the transmission feature accurately.
I read your Twitter feed. Do you mean that glossy transmission would cost a lot of performance on Apple platforms? But I would think the iPhone's performance should be very high...
I think that rasterization-based transmission is always something of a hack. Most forward renderers implement it by rasterizing all opaque objects into an offscreen render target, then downsampling it, then rendering objects that need transmission. But this has obvious artifacts: transmissive objects only transmit/refract opaque objects, and only a single “layer” of transmissive surfaces can be rendered accurately. And then if you want to do this in a plausible way with stereoscopic rendering you pay the full cost of all those passes again.
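The multi-pass approach described above can be sketched in outline. The pass names and the roughness-to-mip mapping below are illustrative assumptions (web engines commonly use a log2-based LOD mapping for glossy transmission), not any particular engine's actual pipeline:

```python
import math

def transmission_lod(roughness, framebuffer_size):
    """Map material roughness to a mip level of the downsampled
    opaque-scene texture: rougher surfaces sample blurrier mips,
    approximating glossy (rough) transmission."""
    r = min(max(roughness, 0.0), 1.0)
    return math.log2(framebuffer_size) * r

def render_frame(transmissive_roughnesses, framebuffer_size=1024):
    passes = []
    # 1. Rasterize all opaque objects into an offscreen color target.
    passes.append("opaque -> offscreen")
    # 2. Downsample (build a mip chain) so rough transmission can
    #    sample pre-blurred copies of the scene.
    passes.append("mipmap offscreen")
    # 3. Render transmissive objects, each sampling the offscreen
    #    texture at a roughness-dependent LOD. This is where the
    #    artifacts come from: only the opaque pass is visible through
    #    the glass, and stacked transmissive layers see stale data.
    for r in transmissive_roughnesses:
        passes.append(f"transmissive @ lod {transmission_lod(r, framebuffer_size):.1f}")
    return passes
```

For stereoscopic rendering, `render_frame` would have to run once per eye, doubling the cost of the offscreen and downsampling passes, which is the point made above.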
So this is why I think Apple hasn’t invested in transmissive rendering. It’s obviously an improvement over alpha blending in certain narrow cases, but it’s a costly technique with many obvious downsides. Maybe the equation will eventually shift when real-time path tracing solutions become practical.
Since we support KHR_materials_transmission for import, and since it seems unlikely that any Apple platform rasterizer will support glossy transmission in the foreseeable future, I propose closing this issue.
This issue will be closed automatically in seven days if no further response is received.
So GLTFKit2 only supports KHR_materials_transmission for import, but doesn't do anything with the transmission data?
Given everything I’ve explained above, what precisely do you think the library could do differently?
SceneKit doesn’t support transmission.
RealityKit doesn’t support transmission.
If you want transmission, you’re free to write your own Metal-based rendering engine that implements it, but that is far outside the scope of what a model import library should be responsible for.
The sample viewers in this repo are meant to be lightweight examples of how to map glTF features onto the capabilities Apple has cared to implement in their system-provided rendering engines. Apple doesn’t seem to care about transmission, so if you want it, write it yourself.
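For context, "supports the extension for import" means the importer parses the extension's material JSON and exposes its values; what a renderer then does with them is a separate question. A minimal sketch of what that data looks like, using Python and a trimmed inline material for illustration (not GLTFKit2's actual API):

```python
import json

# A trimmed glTF material using KHR_materials_transmission, shaped like
# the materials in assets such as TransmissionTest.glb (illustrative).
gltf_material = json.loads("""
{
  "name": "glass",
  "pbrMetallicRoughness": { "baseColorFactor": [1.0, 1.0, 1.0, 1.0] },
  "extensions": {
    "KHR_materials_transmission": { "transmissionFactor": 1.0 }
  }
}
""")

ext = gltf_material.get("extensions", {}).get("KHR_materials_transmission", {})
# The spec's default transmissionFactor is 0.0 (fully opaque).
transmission = ext.get("transmissionFactor", 0.0)
```

An importer can faithfully hand this `transmission` value to the application; it is then up to the rendering engine (SceneKit, RealityKit, or your own) to have somewhere meaningful to put it.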
Okay, I understand.
Closing.
I put a KHR_materials_transmission model into the GLTFViewer. The result looks strange and differs from Babylon's. Does GLTFKit2 support this feature now? The model is linked below: https://github.com/KhronosGroup/glTF-Sample-Models/blob/master/2.0/TransmissionTest/glTF-Binary/TransmissionTest.glb
GLTFViewer:
Babylon: