Closed by WestLangley 2 years ago
I think that displaying transmissive materials (as defined by `KHR_materials_transmission`) through other transmissive materials may be considered a feature request. Displaying opaque materials and the environment map through the transmission — as we do now — was priority (1), displaying alpha blend materials through transmission is probably second (2), and transmission through transmission third (3).
Similarly, I believe I'm seeing (1) in https://sandbox.babylonjs.com/ and https://github.khronos.org/glTF-Sample-Viewer-Release/, with reflections only from the nearest transmission material. Adding (2) and (3) requires techniques like depth peeling; see comment by @MiiBond in https://github.com/mrdoob/three.js/issues/16977#issuecomment-836197846. This would certainly be a welcome addition, but we'd want the flexibility to configure the number of layers, or disable that entirely for performance.
- Also, in Chrome and Firefox (not Safari):

  `THREE.WebGLProgram: gl.getProgramInfoLog() WARNING: Output of vertex shader 'vUv' not read by fragment shader`

  Transmission always sets the `USE_UV` flag.
Fixed https://github.com/mrdoob/three.js/commit/ac6d30a393e6974c3cb15e8113303b99cbb18e72
- If performant, I think the dimensions of the "transmission render target" should match the dimensions of the "current render target". Doing so will prevent artifacts like the following:
Worth noting that this would only work in WebGL2, as we do need mipmaps.
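As a sketch of the sizing rule implied above (the helper name is hypothetical, not three.js API): in WebGL2 the transmission target can simply match the current render target, while WebGL1 needs power-of-two dimensions for mipmapping.

```javascript
// Hypothetical helper: pick transmission render target dimensions.
// WebGL2 supports mipmaps on NPOT textures, so the transmission target can
// match the current render target exactly; WebGL1 requires power-of-two
// sizes for mipmapped textures, so we round down to the nearest power of two.
function chooseTransmissionTargetSize(width, height, isWebGL2) {
  if (isWebGL2) return [width, height];
  const floorPowerOfTwo = (v) => Math.pow(2, Math.floor(Math.log2(v)));
  return [floorPowerOfTwo(width), floorPowerOfTwo(height)];
}

console.log(chooseTransmissionTargetSize(1920, 1080, true));  // [ 1920, 1080 ]
console.log(chooseTransmissionTargetSize(1920, 1080, false)); // [ 1024, 1024 ]
```

Matching the current target avoids the resampling artifacts mentioned above; the power-of-two fallback is why a WebGL1 path cannot simply mirror the viewport.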
Another problem with the current transmission code is that it "breaks" in transparent contexts.
Here I'm setting `alpha: true` and `renderer.domElement.backgroundColor = 'red'`:
/cc @elalish
I consider this pretty important. The DOM uses transparency pretty extensively which is why model-viewer uses alpha: true. This way DOM backgrounds and gradients and such fit nicely into the scene and the 3D doesn't have to look like a rectangle. Of course this is non-physical, so we'll need some kind of simple approximation. I'm hoping we can just get the shader to set the alpha to whatever the transmission % of the pixel was, though we'll have to be careful to keep it opaque if it was already drawn to by something opaque. It won't blur the DOM background, but I think that's fine, and more or less in line with not seeing transparent objects through each other.
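A scalar sketch of the approximation described above. The blending rule is my assumption (reading "the transmission % of the pixel" as coverage `1 - transmission`), and `compositeAlpha` is a hypothetical name, not anything in three.js:

```javascript
// Hypothetical sketch: approximate output alpha for a transmissive pixel.
// A highly transmissive pixel should let the DOM background show through
// (low alpha), but a pixel already covered by something opaque must stay
// opaque, hence the max() with the existing alpha.
function compositeAlpha(existingAlpha, transmission) {
  const coverage = 1 - transmission; // assumption: coverage = 1 - transmission
  return Math.max(existingAlpha, coverage);
}

console.log(compositeAlpha(0.0, 0.8)); // mostly see-through
console.log(compositeAlpha(1.0, 0.8)); // stays opaque
```

As noted, this would not blur the DOM background behind the material; it only approximates how much of it shows through.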
Hello @takahirox, thanks for the PR! I found that the current example has gaps when viewed from a side or back view, but the front view is OK. I tried with a simple box; it seems all refracted pixels are pushed a little along the +Z axis?
> I tried with a simple box; it seems all refracted pixels are pushed a little along the +Z axis?
@takahirox Any ideas?
Unfortunately I currently lack the time to tackle this problem.
> I tried with a simple box; it seems all refracted pixels are pushed a little along the +Z axis?
This guess sounds right to me. The ray is calculated in `src/renderers/shaders/ShaderChunk/transmission_pars_fragment.glsl.js`. I'd be happy if anyone could help investigate the details.
Update: The model looks fine if `thickness` is zero, so I suspect there is either a problem in this function, or the parameters passed to it may be wrong.
I've made a model that shows the issue: TransmissionThickness.gltf.zip
khronos | babylon | dev
---|---|---
(screenshot) | (screenshot) | (screenshot)
Thanks for making the test model! The model looks correct in the glTF sample viewer, which I referred to for the transmission shader code.
So I guess I either edited the shader code that calculates the ray incorrectly, or passed the wrong parameters to the function.
@UX3D-eckerlein Any ideas of what we got wrong?
@whatisor Is this something you dealt with in your implementation?
I think I've found the root issue: I passed the wrong normal parameter to the function. It expects a world-space normal, but I passed a world normal with the view matrix applied.
This is the screenshot of Three.js + test model with the fix.
I'll probably make a PR soon.
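The symptom matches what happens when a view-space normal is fed into a routine expecting a world-space one. A tiny refraction calculation shows the ray getting bent sideways (plain vector math; the 90° view rotation and eta value are illustrative, not taken from the actual shader):

```javascript
// GLSL-style refract() for 3-component vectors.
function refract(I, N, eta) {
  const dot = I[0] * N[0] + I[1] * N[1] + I[2] * N[2];
  const k = 1 - eta * eta * (1 - dot * dot);
  if (k < 0) return [0, 0, 0]; // total internal reflection
  const f = eta * dot + Math.sqrt(k);
  return [eta * I[0] - f * N[0], eta * I[1] - f * N[1], eta * I[2] - f * N[2]];
}

const I = [0, 0, -1];          // view ray hitting the surface head-on
const worldNormal = [0, 0, 1]; // surface facing the camera
// The same normal after an illustrative 90-degree view rotation about Y:
const viewSpaceNormal = [1, 0, 0];
const eta = 0.75;

console.log(refract(I, worldNormal, eta));     // passes straight through: [0, 0, -1]
console.log(refract(I, viewSpaceNormal, eta)); // bent sideways -> shifted samples
```

A refraction ray computed with the wrong-space normal samples the transmission texture at a laterally shifted position, which would explain the pixels appearing pushed along one axis.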
Thank you for fixing the shader. I tried again with the latest version of the library. Looks good.
Library | IridescentDishWithOlives.gltf - closed | IridescentDishWithOlives.gltf - open
---|---|---
glTF Sample Viewer | (screenshot) | (screenshot)
Three.js r131dev | (screenshot) | (screenshot)
Three.js r131.1 | (screenshot) | (screenshot)
> I think that displaying transmissive materials (as defined by `KHR_materials_transmission`) through other transmissive materials may be considered a feature request. Displaying opaque materials and the environment map through the transmission — as we do now — was priority (1), displaying alpha blend materials through transmission is probably second (2), and transmission through transmission third (3).
>
> Similarly, I believe I'm seeing (1) in https://sandbox.babylonjs.com/ and https://github.khronos.org/glTF-Sample-Viewer-Release/, with reflections only from the nearest transmission material. Adding (2) and (3) requires techniques like depth peeling; see comment by @MiiBond in #16977 (comment). This would certainly be a welcome addition, but we'd want the flexibility to configure the number of layers, or disable that entirely for performance.
Hey @donmccurdy, any idea when we might see (2) and (3) in three.js? I assume (2) is achievable, because as far as I know babylon.js can render (2) but not (3), while three.js supports neither. I am going to ask this in three.js as well, but at least from the glTF implementation point of view, when can we expect (3)?
@rohan-deshpande I'm not aware that anyone is working on three.js support for (2) or (3) at this time. There isn't really any work to do in GLTFLoader here, it'd be a matter of updating WebGLRenderer internal code I think.
@mrdoob What would updating WebGLRenderer to support this entail? Please let me know. I am ideally after (3), but even (2) would work well for the time being. I'd definitely want to contribute, but would love some direction on where to start.
(2) is fairly easy, (3) is not...
What does your scene look like?
I have a pair of acrylic spectacles that I am working on showing in a WebGL container. See the images below.

Three:

Babylon:

Only the stem has a double mesh (the frame will also have it). One has volume, transmission and IOR, while the other has only opacity via the `baseColorFactor` property. This setup lets me see the stems through the frame and vice versa.
The ideal scenario would be to have (3), so I can have half of the polycount, but this also works for the time being.
> The ideal scenario would be to have (3), so I can have half of the polycount
Any implementation of (3) is likely to involve rendering the model many times ("depth peeling") so I would expect that to reduce performance rather than improve it. Additional polycount is likely the more efficient option.
@donmccurdy I see. In that case 2 is ideal. @mrdoob how can I/my team help to get this implemented? It is kind of crucial for us.
@mrdoob @donmccurdy will it be possible to point us in the right direction so that we can maybe create a PR?
I've been playing with the transmission texture recently and it's wonderful!
A couple of suggested improvements:

- Use linear rather than nearest filtering for magnification on the transmission render target. You can see hard edges on pixels in this image for a material with low roughness.
- Resolve (5) for now by exposing creation of the transmission target (see #22731). I'm suggesting this, rather than having three.js create transmission targets to match the viewport dimensions, because of the complexities involved. Making it possible for users to handle those things themselves when they need to would be a nice solution without adding complexity to three.js.
You can see the impact of these changes in this demo (toggle Fix Transmission)
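The filtering suggestion can be seen with a 1-D analogy (my own illustration, not three.js code): magnifying a tiny texture with nearest sampling produces flat steps with hard edges, while linear sampling interpolates between texels.

```javascript
// 1-D texture sampling: nearest vs linear magnification of a tiny texture.
function sampleNearest(texels, u) {
  // snap to the closest texel -> flat steps, hard edges when magnified
  return texels[Math.round(u * (texels.length - 1))];
}

function sampleLinear(texels, u) {
  // interpolate between the two surrounding texels -> smooth ramp
  const t = u * (texels.length - 1);
  const i = Math.floor(t);
  const j = Math.min(i + 1, texels.length - 1);
  const frac = t - i;
  return texels[i] * (1 - frac) + texels[j] * frac;
}

const tiny = [0, 1]; // a 2-texel "texture" being heavily magnified
console.log(sampleNearest(tiny, 0.25)); // 0 (hard step at u = 0.5)
console.log(sampleLinear(tiny, 0.25));  // 0.25 (smooth ramp)
```

The same effect in 2-D is what produces the blocky pixel edges visible on low-roughness transmissive materials when the render target is magnified with nearest filtering.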
Hi @mrdoob @donmccurdy, Would it be possible to get an idea or pointer in the right direction with regards to getting the alpha blend materials to show through transmission / volume? I am pretty much new to three js and would appreciate some help.
@rohan-percept sorry, I don't have pointers to get things started on (2) or (3).
Related: #23448
I'm not sure that (3), "transmission through transmission", necessarily requires depth peeling. Another approach would be to copy the render contents to a secondary transmission source target while rendering the transmissive meshes back to front. This would be a bit more limiting (one part of a mesh couldn't be seen through another part of the same mesh), but it should work on the above glasses model if the side pieces are separate meshes, and I'd expect it to be more performant than depth peeling.
The down side is that it would require another render target and render target copies. Consider the following case with three transmissive meshes sorted back to front A, B, C:
This is a rough algorithm, at least. I'm not sure how intensive a full target copy would be, but maybe it's a starting point. Do we know what Babylon is doing?
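A toy simulation of the copy-based approach described above (the names and the one-float "framebuffer" are mine, purely illustrative): before each transmissive draw, the current framebuffer is copied to the transmission source, so each mesh can "see" the transmissive meshes already drawn behind it.

```javascript
// Toy model: the framebuffer is a single float; each transmissive "draw"
// multiplies whatever it sees through its copy of the framebuffer by its tint.
function renderTransmissiveBackToFront(meshes, background) {
  // sort back to front (larger depth = farther from the camera)
  const sorted = [...meshes].sort((a, b) => b.depth - a.depth);
  let framebuffer = background;
  const order = [];
  for (const mesh of sorted) {
    const transmissionSource = framebuffer;       // copy framebuffer -> source
    framebuffer = mesh.tint * transmissionSource; // draw mesh over the copy
    order.push(mesh.name);
  }
  return { framebuffer, order };
}

const result = renderTransmissiveBackToFront(
  [{ name: 'C', depth: 1, tint: 0.5 },
   { name: 'A', depth: 3, tint: 0.5 },
   { name: 'B', depth: 2, tint: 0.5 }],
  1.0
);
console.log(result.order);       // [ 'A', 'B', 'C' ]
console.log(result.framebuffer); // 0.125 -- C sees B through A, etc.
```

The per-draw copy is the cost the comment above worries about: one full render target copy per transmissive mesh, versus a fixed number of passes for depth peeling.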
Depth peeling is definitely a nice option, but also expensive, and I think a big first step for it would be the ability to set and swap custom depth textures on render targets (#19554).
Hmm, now that we're moving into WebGL2... Sooner or later we'll have to investigate rendering everything into a multisampled RT in linear and copying the RT into the framebuffer while applying gamma and tonemapping.

Although I think @Mugen87 tried this recently and it was slower than the current setup.

I think we could achieve something decent by doing:

- Use RT2 as transmission input and render the backside of all doublesided transmissive meshes into RT1.
- Use RT1 as transmission input and render transmissive meshes into the framebuffer.
I don't know if this is going to have all that great of an effect and probably wouldn't work well on the sun glasses example above. If backfaces are rendered first and used for transmission when rendering front faces then every frontface transmission fragment is effectively going to have the transmission tint doubly applied (once for backside, once for frontside). And with refraction you'll probably get artifacts at the edges of the mesh when the refracted sample doesn't line up with the back face mesh edges.
And if only backfaces are used in the transmission then continuity of things like the top highlight caused by the front ridge of glasses side piece will be lost:
It might be worth trying out; I just expect there to be more issues than you might think.
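The double-tint concern can be checked numerically with Beer-Lambert attenuation (the numbers are illustrative): if the full-thickness tint is applied once in the back-face pass and again in the front-face pass, the material effectively absorbs as if it were twice as thick.

```javascript
// Beer-Lambert transmittance through a slab: T = exp(-absorption * thickness).
const absorption = 0.5;
const thickness = 2.0;

const correct = Math.exp(-absorption * thickness); // tint applied once
const doubled = correct * correct;                 // back pass + front pass

console.log(correct); // ~0.368
console.log(doubled); // ~0.135, identical to a slab of twice the thickness
```

So a two-pass backface-then-frontface scheme needs some way to apply the volume attenuation only once per view ray, or transmissive objects come out noticeably darker than intended.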
> Although I think @Mugen87 tried this recently and it was slower than the current setup.
Correct. Applying gamma and tone mapping in an additional render pass was noticeably slower than doing it inline. In certain examples, performance dropped from 60 to 40 FPS when using a FP32 render target, see https://github.com/mrdoob/three.js/issues/23019#issuecomment-1020493869.
Most of the issues raised in the original post have been addressed, so I think it is OK to close this now.
Reposting https://github.com/mrdoob/three.js/issues/21000#issuecomment-862597968 so as to consolidate some remaining issues.
- `webgl_loader_gltf_transmission` has the `transparent` property of each sphere set to `false`. Is that what the loader should be doing? What should `transparent` be set to when `transmission` is non-zero?
- After setting `transparent` to `true`, only front-faces show the hot-spot. (This may be a known limitation, but @mrdoob just added support for properly rendering back-faces of double-sided meshes.) There should be two hot-spots. FIXED: This is caused by issue 7 below.
- FIXED: Also, in Chrome and Firefox (not Safari), transmission always sets the `USE_UV` flag.
- FIXED: If performant, I think the dimensions of the "transmission render target" should match the dimensions of the "current render target". Doing so will prevent artifacts like the following:
- FIXED: In the following code snippet, `transmissionFactor` is applied to `getIBLVolumeRefraction()` twice. I expect that is a physical modeling error. https://github.com/mrdoob/three.js/blob/dd6ab3cf310f78a8375b8b2037db12ce8e31e82c/src/renderers/shaders/ShaderChunk/transmission_fragment.glsl.js#L23-L28
- FIXED: Also in the above snippet, `totalSpecular` has units of radiance, and therefore cannot be the correct argument because the function expects a unit-less quantity. A reasonable guess would be `material.specularColor` instead.

/ping @takahirox @donmccurdy
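The double application of `transmissionFactor` can be caricatured with scalars (this is my simplification, not the actual shader math): if the refraction routine already scales its result by the factor and the final mix applies it again, the transmitted contribution is under-weighted quadratically.

```javascript
// Scalar caricature of mixing diffuse lighting with volume refraction.
function mixTransmission(diffuse, transmitted, factor, hasBug) {
  // bug: factor was applied inside the refraction lookup AND again in the mix
  const refracted = hasBug ? factor * transmitted : transmitted;
  return (1 - factor) * diffuse + factor * refracted;
}

console.log(mixTransmission(1.0, 1.0, 0.5, true));  // 0.75: transmission too weak
console.log(mixTransmission(1.0, 1.0, 0.5, false)); // 1.0: energy preserved
```

With equal diffuse and transmitted radiance the result should be unchanged by the mix; the buggy version loses energy by the square of the factor.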