mrdoob / three.js

JavaScript 3D Library.
https://threejs.org/
MIT License

Intent to implement/contribute Enterprise PBR improvements #16977

Closed: bhouston closed this 7 months ago

bhouston commented 5 years ago
Description of the problem

To give context to a series of PRs we are making to Three.JS, I wanted to explain our motivation. We are adopting Enterprise PBR (PBR Next), as it is also the new material model that glTF is standardizing on. Enterprise PBR is a unification of advanced PBR parameters beyond just roughness and metalness.

It is specified here: https://dassaultsystemes-technology.github.io/EnterprisePBRShadingModel/spec.md.html

The main additions to the Three.JS PBR model needed to correctly adopt Enterprise PBR are:

We are aiming to add these to the Node graph material system as I personally believe that is the future of material definitions (https://github.com/mrdoob/three.js/issues/16440). I think it is relatively easy to back-port them to PhysicalMaterial as well.

This will be useful for an advanced glTF loader in Three.JS.

I believe it is also the direction other projects are going, including Google's Filament. We are aiming to have our Three.JS contributions be directly compatible with Filament when possible.

bhouston commented 5 years ago

/ping @DanielSturk - this gives context to your PRs so that people understand the overall motivation.

donmccurdy commented 5 years ago

Enthusiastic +1 from me! šŸ™‚

This link is currently broken, but the <model-viewer/> project has some render comparison infrastructure (for three.js and Filament) that may be helpful along the way.

elalish commented 5 years ago

  Enthusiastic +1 from me!

  This link is currently broken, but the <model-viewer/> project has some render comparison infrastructure (for three.js and Filament) that may be helpful along the way.

No longer broken!

donmccurdy commented 4 years ago

The glTF sheen extension (KHR_materials_sheen) is nearly complete, although still welcoming feedback for a bit longer. To implement in MeshPhysicalMaterial, it would require:

The two maps use .rgb and .a channels respectively; they can be combined.
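A minimal sketch of what consuming those maps in a loader could look like. The sheenColorMap / sheenRoughnessMap property names on the material are assumptions for illustration, not existing three.js API:

function applySheen( material, extensionDef, textures ) {

    // KHR_materials_sheen: sheen color lives in .rgb and sheen roughness in .a,
    // so when both definitions reference the same image they can share one texture.
    material.sheen = 1.0; // hypothetical scalar enabling the sheen lobe

    if ( extensionDef.sheenColorTexture !== undefined ) {

        material.sheenColorMap = textures[ extensionDef.sheenColorTexture.index ]; // .rgb

    }

    if ( extensionDef.sheenRoughnessTexture !== undefined ) {

        material.sheenRoughnessMap = textures[ extensionDef.sheenRoughnessTexture.index ]; // .a

    }

}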

donmccurdy commented 3 years ago

The glTF extensions KHR_materials_sheen and KHR_materials_transmission are now complete. Would be great if we could work toward supporting the remaining properties of those PBR features. See this article for a bit more info.

rawwerks commented 3 years ago

+1 for this, in particular a KHR_materials_transmission implementation would have a massive impact on rendering physical products.

i have no appreciation for how much work this is, but i would be curious @mrdoob if this is in line w/ your near-term roadmap for three.js?

re: implementation, the only support i can offer is that both https://github.com/BabylonJS and https://github.com/KhronosGroup/glTF-Sample-Viewer have successfully implemented all three of the new PBR extensions - so perhaps there are methods that can be borrowed from those projects.

also - the new Khronos "toy car" is an all-in-one test of a successful implementation: https://github.com/KhronosGroup/glTF-Sample-Models/tree/a35e94effc01db54f94bab34f793c960276a67fc/2.0/ToyCar

mrdoob commented 3 years ago

How should this be architected? Seems to me that KHR_materials_transmission would require this:

1 - Render opaque to a render target (front to back)
2 - Render transparent to a render target (back to front)
3 - Render refractive using the current render target (back to front)

Considering that WebGL1 doesn't support multisampled render targets, we can't just use this architecture for everything: WebGL1 users would see either aliasing or no refraction.

This could be the default architecture for WebGL2 though.
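A minimal sketch of that pass ordering, assuming the WebGL2 path uses WebGLMultisampleRenderTarget; width/height and the opaqueScene/transparentScene/refractiveScene variables stand in for real sizes and filtered render lists:

// WebGL2 can render the intermediate passes into a multisampled target so
// they stay antialiased; WebGL1 would have to fall back to an aliased target.
const target = renderer.capabilities.isWebGL2
    ? new THREE.WebGLMultisampleRenderTarget( width, height )
    : new THREE.WebGLRenderTarget( width, height );

renderer.autoClear = false; // don't wipe pass 1 when drawing pass 2

renderer.setRenderTarget( target );
renderer.clear();
renderer.render( opaqueScene, camera );      // 1. opaque, front to back
renderer.render( transparentScene, camera ); // 2. transparent, back to front

renderer.setRenderTarget( null );
renderer.clear();
renderer.render( refractiveScene, camera );  // 3. refractive materials sample target.texture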

bhouston commented 3 years ago

The proposed approach only works for single-layered transmission. I believe the more correct solution would be to implement an OIT (order-independent transparency) method, per https://github.com/mrdoob/three.js/issues/9977. It would then work on both WebGL1 and WebGL2.

The Cesium project has OIT code: https://gitlab.sensoro.com/wushijing/cesium/blob/9fd4154a2eb3696f1c4c053ccf3a9b8354d683d4/Source/Scene/OIT.js. Clara.io has also implemented OIT.

https://cesium.com/blog/2014/03/14/weighted-blended-order-independent-transparency/
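For reference, the resolve step of the weighted blended method described in that article is a single full-screen pass. A sketch, with assumed texture and varying names:

// Transparent surfaces are first accumulated into two targets:
// accumTexture (rgb = weight * premultiplied color, a = weight * alpha)
// and revealageTexture (running product of 1 - alpha). This pass averages
// them and composites over the opaque image with coverage 1 - revealage.
const resolveFragmentShader = /* glsl */`
    uniform sampler2D accumTexture;
    uniform sampler2D revealageTexture;
    varying vec2 vUv;

    void main() {

        vec4 accum = texture2D( accumTexture, vUv );
        float revealage = texture2D( revealageTexture, vUv ).r;

        gl_FragColor = vec4( accum.rgb / max( accum.a, 1e-5 ), 1.0 - revealage );

    }
`;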

mrdoob commented 3 years ago

I'm not sure OIT solves refraction.

rawwerks commented 3 years ago

I might be mis-reading the documentation, but I believe that the KHR_materials_transmission extension solves the refraction part regardless of OIT: https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_materials_transmission#refraction

  Therefore, correct ordering is not an absolute requirement when implementing this extension in realtime renderers, nor is rendering all potentially overlapping layers.

Is this example from the glTF Sample Viewer (which also uses WebGL) relevant to the challenge here?

Transmission is set from the extension here: https://github.com/KhronosGroup/glTF-Sample-Viewer/blob/2e6f9f1cfef04239cc8c8c403a5c49a242b1dc3f/src/material.js

    // KHR Extension: Transmission
    if (this.extensions.KHR_materials_transmission !== undefined)
    {
        let transmission = this.extensions.KHR_materials_transmission.transmission;

        if (transmission === undefined)
        {
            transmission = 0.0;
        }

        this.defines.push("MATERIAL_TRANSMISSION 1");

        this.properties.set("u_Transmission", transmission);
    }

then loaded into the PBR shader: https://github.com/KhronosGroup/glTF-Sample-Viewer/blob/2e6f9f1cfef04239cc8c8c403a5c49a242b1dc3f/src/shaders/pbr.frag


    #ifdef MATERIAL_TRANSMISSION
    vec3 diffuse = mix(f_diffuse, f_transmission, materialInfo.transmission);
    #else
    vec3 diffuse = f_diffuse;
    #endif

    color = (f_emissive + diffuse + f_specular + f_subsurface + (1.0 - reflectance) * f_sheen) * (1.0 - clearcoatFactor * clearcoatFresnel) + f_clearcoat * clearcoatFactor;

The index of refraction is set by extensions.KHR_materials_ior and used here: https://github.com/KhronosGroup/glTF-Sample-Viewer/blob/2e6f9f1cfef04239cc8c8c403a5c49a242b1dc3f/src/shaders/ibl.glsl, which contains getIBLRadianceTransmission, used in turn by pbr.frag.

@MiiBond - any ideas?

MiiBond commented 3 years ago

Hello. Yeah, OIT is about rendering transparent polys in the correct order (or approximately correct order in some cases). The most important requirement for the transmission extension is rendering transparency with the correct blending and refraction. Transmissive surfaces can both absorb and reflect light so it's not really possible to use traditional OIT methods as two different blend modes are needed. Also, refraction requires sampling from an already-rendered target. What we did for the Babylon loader was:

  1. Set up a target where only opaque objects are rendered in a first pass (this can be slightly lower resolution than the canvas, with power-of-two dimensions so that mips can be generated every frame).
  2. The whole scene is then rendered normally, and transmissive materials sample the opaque render target when calculating refraction and absorption of the background scene. You can also use the mips to represent light transmitting through a rough surface.

This is very straightforward to do, so it's what we suggested as the bare-bones approach to supporting the transmission extension. It's also, notably, what Sketchfab seems to do for refraction; they likewise only support refraction of opaque objects.

If you want to get more complicated, you can combine it with some OIT techniques. I used depth peeling, combined with MRT, to render out a g-buffer (of sorts) and then composite the layers back together to render multiple layers of transparency with correct PBR blending. Here's an example scene: https://adobe.ly/33DOPoA

mrdoob commented 3 years ago

@MiiBond Super helpful! Many thanks! šŸ™

takahirox commented 3 years ago

Lately I have been studying PBR and locally tried transmission support as mentioned above: first rendering opaque objects to a render target, and then rendering the whole scene using that render target for refraction. I generate mipmaps of the render target every frame and use textureLod for rough transmission. Judging from the screenshot, the basic concept seems fine.

[screenshot: rough transmission test render]
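The rough-transmission lookup described above might look roughly like this. This is a sketch, not the actual shader from the experiment, and the uniform names are made up:

// Pick a mip level of the opaque-pass render target from the material
// roughness, so rougher surfaces see a blurrier background. The explicit
// LOD fetch needs WebGL2, or EXT_shader_texture_lod on WebGL1.
const transmissionChunk = /* glsl */`
    uniform sampler2D transmissionSamplerMap;
    uniform vec2 transmissionSamplerSize;

    vec3 getTransmissionSample( vec2 fragCoord, float roughness ) {

        // Mip 0 is sharp; the highest mip is a heavy blur of the whole target.
        float lod = roughness * log2( transmissionSamplerSize.x );
        return textureLod( transmissionSamplerMap, fragCoord, lod ).rgb;

    }
`;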

mrdoob commented 3 years ago

@takahirox That is looking great!

takahirox commented 3 years ago

And other devs seem to be working on it, too... https://github.com/mrdoob/three.js/issues/21000#issuecomment-832060947

mrdoob commented 3 years ago

@takahirox Would you like to do a PR with what you have? Maybe @whatisor can then follow it with further improvements.

takahirox commented 3 years ago

OK, if @whatisor doesn't mind. (@whatisor, please let me know if you would rather build yours from scratch.)

And before making a PR I would like to build a consensus on the API and implementation. I will write up my suggestion here soon, hopefully today or tomorrow.

whatisor commented 3 years ago

@takahirox Sure, please share your source, I think I can learn much from that to improve mine.

takahirox commented 3 years ago

As I wrote, I think @MiiBond's approach is good:

  1. Set up a target where only opaque objects are rendered in a first pass (this can be slightly lower resolution than the canvas, with power-of-two dimensions so that mips can be generated every frame).
  2. The whole scene is then rendered normally, and transmissive materials sample the opaque render target when calculating refraction and absorption of the background scene. You can also use the mips to represent light transmitting through a rough surface.

The points I want to discuss about the API and implementation are...

1. The reason why only opaque objects

First I would like to build a consensus on why we render only opaque objects to a render target in the first pass, not all objects.

My understanding is that it is a good balance between performance and quality. Three.js is primarily designed as a real-time 3D engine; we should adopt an efficient approach even if we sacrifice some correctness.

The KHR_materials_transmission specification also mentions:

  We recommend that client implementations aim to display at least opaque objects through a transmissive material.

https://github.com/KhronosGroup/glTF/blob/master/extensions/2.0/Khronos/KHR_materials_transmission/README.md#implementation-notes

So I think it's good for us to go with this approach and we can revisit later if we get a lot of requests for better transparency.

2. Who sets up the render target? User or Renderer?

We need to add a new property to MeshPhysicalMaterial for the opaque-objects render target. For now I will call the property transmissionSamplerMap.

Who sets up the render target?

2-a. User

import { WebGLRenderTarget, LinearMipmapLinearFilter, NearestFilter, ClampToEdgeWrapping } from 'three';

const renderTarget = new WebGLRenderTarget(1024, 1024, {
    generateMipmaps: true,
    minFilter: LinearMipmapLinearFilter,
    magFilter: NearestFilter,
    wrapS: ClampToEdgeWrapping,
    wrapT: ClampToEdgeWrapping
});

const render = () => {
    // First pass: hide transparent objects and unset the sampler map,
    // then render the opaque objects into the render target.
    scene.traverse(obj => {
        if (obj.material) {
            if (obj.material.transparent) {
                obj.material.visible = false;
            }
            if (obj.material.transmission) {
                obj.material.transmissionSamplerMap = null;
            }
        }
    });

    renderer.setRenderTarget(renderTarget);
    renderer.render(scene, camera);
    renderer.setRenderTarget(null);

    // Second pass: restore visibility, feed the opaque render into the
    // transmissive materials, and render the whole scene normally.
    scene.traverse(obj => {
        if (obj.material) {
            if (obj.material.transmission) {
                obj.material.transmissionSamplerMap = renderTarget.texture;
            }
            obj.material.visible = true;
        }
    });

    renderer.render(scene, camera);
};

Pros: The renderer won't become more complex, because we don't need to add the first opaque render pass to it.

Cons: User code becomes complex, and if users don't set things up correctly they won't get the expected result.

2-b. Renderer

Renderer sets up the render target in .render().

Pros: No user code changes. And even if we optimize or improve the transmission later, users automatically get the benefit.

Cons: Renderer can be a bit complex.

2-c. Hybrid

If material.transmissionSamplerMap is set by the user, the renderer uses it; otherwise the renderer sets up the render target.

I prefer 2-b or 2-c.

3. How should Renderer know whether it needs to set up the render target?

If we adopt 2-b or 2-c, renderer needs to know whether it needs to set up the render target or not. How should renderer know that?

3-a. Add a new renderer property that the user sets to true:

renderer.transmission = true;

3-b. Renderer automatically detects it in .render(), like:

let needsTransmissionSamplerMap = false;
scene.traverse(obj => {
  if (obj.material && obj.material.isMeshPhysicalMaterial && obj.material.transmission > 0) {
    needsTransmissionSamplerMap = true;
  }
});

Perhaps 3-a is good?

4. How should envMap cooperate with new transmission?

This is the part I haven't been able to think through or look into deeply yet. envMap also implements refraction, so how should it cooperate with the new transmission? Is just mixing them good enough? Or, if both are set, should only one of them have an effect? See the sketch of the existing refraction path below.
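For context, the envMap refraction being referred to is selected per texture. A rough sketch of current usage, assuming the refractionRatio material property as it exists today, with urls standing in for the six cube face images:

// Refraction via the environment map: switching the mapping mode makes the
// shader refract the view vector instead of reflecting it.
const envMap = new THREE.CubeTextureLoader().load( urls );
envMap.mapping = THREE.CubeRefractionMapping;

material.envMap = envMap;
material.refractionRatio = 0.98; // ratio of indices of refraction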


@whatisor

  @takahirox Sure, please share your source, I think I can learn much from that to improve mine.

Mine is based on the glTF Sample Viewer, too. I read through your code and it looks very similar to mine.

MiiBond commented 3 years ago

@takahirox The reasoning for rendering just the opaque layer is that rendering all transparency correctly requires layering up all the transparent objects in the scene using something like depth peeling or other techniques. This can be expensive, and it's hard to know how many layers to render. There are in-between approximations as well, of course (like just rendering one layer of transparency behind the main one).

For our web viewer for published Adobe Dimension projects, I actually render up to 8 layers currently, using a modified version of Babylon.js. I do this because users often render projects that contain glasses and clear plastics, so maintaining the look is important. I basically render the multiple layers to a g-buffer using depth peeling (sketched below) and then composite them together to get the final RT that will be the "refraction texture" for the final render. It can be expensive, of course, but can be accumulated over frames. https://adobe.ly/33DOPoA

We also store the depth for each peel and use that to calculate something approaching the actual thickness for volume-based effects. https://adobe.ly/2ODrueG
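A heavily simplified sketch of that depth-peeling loop. The peel test itself lives in each material's shader, and uPrevDepth, uResolution, and transparentScene are illustrative names, not real API:

// Peel NUM_PEELS layers of transparency, nearest first. Each pass renders
// the transparent set while the shader discards fragments that are not
// strictly behind the depth captured in the previous peel, e.g.:
//   if ( texture2D( uPrevDepth, gl_FragCoord.xy / uResolution ).r >= gl_FragCoord.z ) discard;
const NUM_PEELS = 4;
const peels = [];

for ( let i = 0; i < NUM_PEELS; i ++ ) {

    const target = new THREE.WebGLRenderTarget( width, height, {
        depthTexture: new THREE.DepthTexture( width, height )
    } );

    uniforms.uPrevDepth.value = ( i > 0 ) ? peels[ i - 1 ].depthTexture : null;

    renderer.setRenderTarget( target );
    renderer.clear();
    renderer.render( transparentScene, camera );

    peels.push( target );

}

// Composite the peels back-to-front to build the final refraction texture.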

mrdoob commented 3 years ago

@takahirox

  1. The reason why only opaque objects

  So I think it's good for us to go with this approach and we can revisit later if we get a lot of requests for better transparency.

šŸ‘

  2. Who sets up the render target? User or Renderer?

  2-b. Renderer

šŸ‘

  3. How should Renderer know whether it needs to set up the render target?

MeshPhysicalMaterial already has a transmission property (float). So I think we can just check for transmission > 0.0.

We could add a new transmissive array here:

https://github.com/mrdoob/three.js/blob/dev/src/renderers/webgl/WebGLRenderLists.js#L61

And then this function could be like this instead:

function push( object, geometry, material, groupOrder, z, group ) {

    const renderItem = getNextRenderItem( object, geometry, material, groupOrder, z, group );

    if ( material.transparent === true ) {

        if ( material.transmission > 0.0 ) {

            transmissive.push( renderItem );

        } else {

            transparent.push( renderItem );

        }

    } else {

        opaque.push( renderItem );

    }

}

We can then:

  1. Render opaque to framebuffer.
  2. Render opaque (again) to transmissionSamplerMap.
  3. Render transmissive to framebuffer.
  4. Render transparent to framebuffer.

Something like this here:

const opaqueObjects = currentRenderList.opaque;
const transparentObjects = currentRenderList.transparent;
const transmissiveObjects = currentRenderList.transmissive;

if ( opaqueObjects.length > 0 ) renderObjects( opaqueObjects, scene, camera );

if ( transmissiveObjects.length > 0 ) {

    this.setRenderTarget( transmissionSamplerMap );
    renderObjects( opaqueObjects, scene, camera );

    this.setRenderTarget( null ); // back to the framebuffer, per steps 2-3
    renderObjects( transmissiveObjects, scene, camera );

}

if ( transparentObjects.length > 0 ) renderObjects( transparentObjects, scene, camera );

  4. How should envMap cooperate with new transmission? Is just mixing them good enough?

If anything, it's a good start I think?

elalish commented 3 years ago

Yes, I like this approach! Regarding how transmission mixes with envMap, I would say what we're really doing (conceptually) is modifying the envMap by rendering the opaque objects onto it - thus blocking some envMap light and replacing it with reflected object light. They are both about gathering the incoming light that passes through transmissive surfaces, and they both get blurred via mipmapping of some kind. The difference is that the opaque objects are only rendered in the view frustum since the transmissive light vectors will stay in that frustum, and the envMap can simply show through by being rendered first into the transmissionSamplerMap.

When we start adding refraction on top of this, things will get a little more interesting, since then the transmission rays won't stay in the view frustum anymore. It's basically the same error as how we don't show reflections of other objects in the scene, which I think is fine for real-time. However, we'll want to be careful not to introduce noticeable artifacts as the refracted rays pass the boundary of the transmissionSamplerMap.

mrdoob commented 3 years ago

@elalish I think with envMap you mean the background? As in, rendering the background in the transmissionSamplerMap first?

I think @takahirox is referring to how the IBL mixes with transmission, but I'm not sure.

elalish commented 3 years ago

@mrdoob I suppose that's right, though it brings up an interesting question: what is the relationship between IBL and background? We see the IBL (not the background) when light is reflected from a surface, so it seems odd to see the background instead of the IBL when refracting through a surface. And of course a background is generally just a screen-toned image, not really a source of light like an IBL is, so it may be difficult to fit it into any linear rendering equations. Still, from an artistic (rather than physics) point of view, I'd guess having transmission sample only the background and not the IBL would probably make the most sense. For sensible physics the IBL and background need to be the same anyway.

takahirox commented 3 years ago

(Give me some more time to reply; I'm fighting the side effects of the 2nd vaccine now.)

mrdoob commented 3 years ago

@takahirox Take your time!

takahirox commented 3 years ago

Sorry for the late response, but I think I've overcome the side effects.

  2. Who sets up the render target? User or Renderer?

  2-b. Renderer

  šŸ‘

So users won't need to be aware of transmissionSamplerMap. What do you think of not exposing this property, then? For example: adding a _ prefix (material._transmissionSamplerMap) to indicate it's a private property, leaving it undocumented, and excluding it from serialization.

  MeshPhysicalMaterial already has a transmission property (float). So I think we can just check for transmission > 0.0.

  We could add a new transmissive array here:

  https://github.com/mrdoob/three.js/blob/dev/src/renderers/webgl/WebGLRenderLists.js#L61

That sounds good to me. I thought adding a new object traversal for transmission detection would be costly, but we already have projectObject().

  if ( material.transparent === true ) {

      if ( material.transmission > 0.0 ) {

          transmissive.push( renderItem );

      } else {

Can we really expect that transmission > 0.0 materials always have transparent = true? I don't think our API guarantees that.

  1. Render opaque to framebuffer.
  2. Render opaque (again) to transmissionSamplerMap.
  3. Render transmissive to framebuffer.
  4. Render transparent to framebuffer.

The background needs to be rendered to transmissionSamplerMap too, doesn't it?

Regarding envMap, sorry, I think I worded that poorly. What I meant was that the new transmission system will implement refraction, and our existing environment map also implements refraction. If we have two differently implemented refractions and apply both, they may look weird.

I'm thinking of not touching the envMap stuff in the first PR and would like to think about it later.

takahirox commented 3 years ago

Made a MeshPhysicalMaterial transmission support improvement PR #21884

mrdoob commented 3 years ago

Sorry for the delayed response.

  So users won't need to be aware of transmissionSamplerMap. What do you think of not exposing this property, then? For example: adding a _ prefix (material._transmissionSamplerMap) to indicate it's a private property, leaving it undocumented, and excluding it from serialization.

Do we need to add it to the material? Can we have a transmissionSamplerMap inside WebGLRenderer instead?

  Can we really expect that transmission > 0.0 materials always have transparent = true? I don't think our API guarantees that.

I think it's up to the user/loader to set transparent to true. That's how we deal with transparency in textures too.
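So the opt-in on the user/loader side would presumably look like this usage sketch:

// With the proposed push() logic above, the renderer routes this material
// into the transmissive list because transparent is true and transmission > 0.
material.transmission = 1.0;
material.transparent = true;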

  The background needs to be rendered to transmissionSamplerMap too, doesn't it?

Yep!

  I'm thinking of not touching the envMap stuff in the first PR and would like to think about it later.

Sounds good!

takahirox commented 3 years ago

  Do we need to add it to the material? Can we have a transmissionSamplerMap inside WebGLRenderer instead?

Good idea. I updated the PR.

  The background needs to be rendered to transmissionSamplerMap too, doesn't it?

  Yep!

I reviewed WebGLBackground and realized that it unshifts the background onto the opaque render list, so renderObjects( opaqueObjects, scene, camera ) alone also renders the background.

Mugen87 commented 7 months ago

Most of the original feature list has been implemented in MeshPhysicalMaterial over the years. Anisotropy, clearcoat, iridescence, sheen, specular, transmission, and advanced reflectivity have been implemented based on the respective glTF specs.

Please file a new issue if an additional Enterprise PBR feature should be added.