mrdoob / three.js

JavaScript 3D Library.
https://threejs.org/
MIT License

ThreeJS Standard Material? (Combine Basic, Lambert, Phong and NormalMap into one.) #4271

Closed bhouston closed 7 years ago

bhouston commented 10 years ago

I was looking at the ThreeJS materials. There are four materials that are nearly the same thing: Basic, Lambert, Phong and NormalMap.

NormalMap seems to have all the features of Lambert and Phong, and even has the ability to turn off features like specular (to create a Lambert material).

I think that Basic is effectively a material where the color acts as emissive and isn't affected by lights.

Thus these four different materials could perhaps be combined into one.

I am wondering if it is worth keeping all of these materials around on the shader side of things.

I think that with proper use of DEFINES (as is already done) we can have, for the most part, a single vertex/fragment shader pair for all standard materials. The variations can be controlled entirely via DEFINES rather than by constructing new shaders from included snippets. I think that with proper use of DEFINES in the more complex shader you can still get the speed you would get from a simple Lambert shader.
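To illustrate, here is a minimal sketch of what define-driven variation could look like on the JavaScript side (buildDefines, the flag names, and standardFragmentSource are all hypothetical, not existing ThreeJS API):

// Hypothetical sketch: one shared shader source, variations selected by defines.
function buildDefines( material ) {

    var defines = [];

    if ( material.lights ) defines.push( '#define USE_LIGHTS' );
    if ( material.specular ) defines.push( '#define USE_SPECULAR' ); // Phong-style specular term
    if ( material.normalMap ) defines.push( '#define USE_NORMALMAP' );

    return defines.join( '\n' ) + '\n';

}

// Prepend the defines to the single shared fragment shader source.
var fragmentSource = buildDefines( material ) + standardFragmentSource;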

One full shader is a lot easier to maintain than three (or four) very similar shaders. (I also struggle to follow the code because of the way the snippets are combined -- I wonder if there is an alternative design possible here that is easier to grok.)

Even if we combined things, we could still keep the existing material classes around for creating specialized materials, for those who do not want to deal with all the options explicitly. Or one could create a factory class for creating common sub-types of StandardMaterial. But underneath, these would all refer to the same shader; they would just expose simplified parameters to users.
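For example, a hypothetical factory might look like this (THREE.StandardMaterial does not exist today; the class and the specular toggle are illustrative only):

// Hypothetical sketch: Lambert as a thin preset over one standard material.
function createLambertMaterial( parameters ) {

    var material = new THREE.StandardMaterial( parameters ); // hypothetical class

    // A Lambert surface is just the standard shader with the
    // specular term toggled off; everything else is shared.
    material.specular = false;

    return material;

}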

BTW I am open to other design suggestions regarding the shaders. I do not have that much experience with dynamic shader generation designs.

(I push for this in part because the more we use a single shader for everything, the more we can group objects together into batches... Now there are some additional complications: specifically, one needs to be able to control the variations without defines to do full batching, but that can come later as an option, I think.)

bhouston commented 10 years ago

One of the reasons I want to combine the code is because of duplication like this for lighting:

In Lambert (about 200 lines of code):

https://github.com/mrdoob/three.js/blob/master/src/renderers/shaders/ShaderChunk.js#L506

Is nearly identical to this code in Phong (another 200 lines of code):

https://github.com/mrdoob/three.js/blob/master/src/renderers/shaders/ShaderChunk.js#L876

Which is again nearly identical to this code in NormalMap (another 200 lines of code):

https://github.com/mrdoob/three.js/blob/master/src/renderers/shaders/ShaderLib.js#L731

Having duplicated, mostly identical code is hard to maintain, and it is generally a violation of the DRY principle: https://en.wikipedia.org/wiki/Don't_repeat_yourself

I am open to other ways of removing the duplication though -- suggestions welcome. But I think it needs to be removed somehow to make this code easier to maintain.

tapio commented 10 years ago

Remark: Phong already supports normal maps, though it does it differently, using standard derivatives instead of tangent attributes like the normal map shader does. Personally I like using the same material "class" and toggling on/off the functionality I need or don't need.

bhouston commented 10 years ago

My preferred setup in the end would be a hierarchical folder full of shaders, along with a preprocessor so that I could have #includes (for example, a common.glsl in the root folder), and it would in the end produce a single JavaScript file that contains these in a map.

Thus something like this in the ThreeJS directory structure would be optimal for me:

/shaders
  common.glsl
  /standard
    config.json // configuration goes here
    fragment.glsl
    vertex.glsl
  /shadow
    config.json // configuration goes here
    fragment.glsl
    vertex.glsl
  /depth
    config.json // configuration goes here
    vertex.glsl
    fragment.glsl
  /cube
    config.json // configuration goes here
    vertex.glsl
    fragment.glsl
  /depthRGBA
    config.json // configuration goes here
    vertex.glsl
    fragment.glsl

Which would be accessible in the end from a library like:

ShaderLibrary.getShader( "cube" ); // returns config file, fragment and vertex shaders as a hash/object.
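A minimal build-step sketch of such a preprocessor (Node.js; the lookup rules and the ShaderLibrary shape are assumptions, not an existing tool):

// Hypothetical build step: resolve "#include" directives recursively and
// collect each shader folder into a single JavaScript map.
var fs = require( 'fs' );
var path = require( 'path' );

function preprocess( file ) {

    var dir = path.dirname( file );

    return fs.readFileSync( file, 'utf8' ).replace( /^#include\s+(\S+)\s*$/gm, function ( match, name ) {

        // Look beside the including file first, then in the shaders root.
        var local = path.join( dir, name );
        return preprocess( fs.existsSync( local ) ? local : path.join( 'shaders', name ) );

    } );

}

function loadShader( name ) {

    var dir = path.join( 'shaders', name );

    return {
        config: JSON.parse( fs.readFileSync( path.join( dir, 'config.json' ), 'utf8' ) ),
        vertex: preprocess( path.join( dir, 'vertex.glsl' ) ),
        fragment: preprocess( path.join( dir, 'fragment.glsl' ) )
    };

}

// The generated JavaScript file would then export something like:
var ShaderLibrary = {
    shaders: { standard: loadShader( 'standard' ), cube: loadShader( 'cube' ) },
    getShader: function ( name ) { return this.shaders[ name ]; }
};
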
WestLangley commented 10 years ago

I agree in principle. I am not sure how to best achieve what you want, so I'll just make some comments...

If you look at ShaderLib.js, it uses "chunks", and the shader code is reused as much as possible... Yes, the normalmap shader may be able to reuse chunks more efficiently than it currently does.

@bhouston wrote: "One of the reasons I want to combine the code is because of duplication like this for lighting: Lambert... is nearly identical to ... Phong"

Well, sort of. 'lambert' does its lighting calculations in the vertex shader and 'phong' does them in the fragment shader, so the code is slightly different, and they use different "chunks".

Also, as @tapio said, the 'normalmap', 'skin', and 'terrain' shaders require tangent attributes. The 'phong' shader supports derivative tangents. So the code looks similar, but there are subtle differences.

@bhouston wrote: "Thus something like this in the ThreeJS directory structure would be optimal for me: config.json // configuration goes here"

What is config.json? Where are the uniforms?

bhouston commented 10 years ago

@WestLangley wrote: "If you look at ShaderLib.js, it uses "chunks", and the shader code is reused as much as possible... Yes, the normalmap shader may be able to reuse chunks more efficiently than it currently does."

The issue is I have been looking at ShaderLib a lot. I do see the reuse, but it is really hard to manage in the way we are doing it. I have to admit that writing shaders in the style of

   "line of code" +
   "line of code" +
   ShaderLib.chunks[chunk_name] +
   "line of code"

is really challenging. I would much prefer to have a single *.glsl file, with a separately usable library of functions in an include (common.glsl), that together I can run in a test environment and edit easily. I think we could easily make a little shader editor where you have a text box for each file. I think it would be amazing for expanding the shader capabilities in ThreeJS.

There are very few contributions to the WebGLShaders.js file in ThreeJS, which is strange because it really should be hugely important to a WebGL library:

https://github.com/mrdoob/three.js/commits/master/src/renderers/WebGLShaders.js

I would argue that the lack of contributions is because it is fairly challenging to deal with this code, not because it is complete or perfect.

@WestLangley wrote: "Also, as @tapio said, the 'normalmap', 'skin', and 'terrain' shaders require tangent attributes. The 'phong' shader supports derivative tangents. So the code looks similar, but there are subtle differences."

I think that the minor differences can be handled by DEFINEs relatively easily so that there is no code duplication.

I think there is a lack of function usage in these shaders too. Using a common function library is actually a nice way to share code, and I think it does so in a more organized, higher-level fashion than the chunk system (the chunk system is sort of like inline functions where the inputs and outputs are implicit and opaque; it is sort of scary if you think about it).

@WestLangley wrote: "What is config.json? Where are the uniforms?"

Two very good questions. I was going to have the defines, uniforms, and attributes exposed in the config.json file. Thus the full configuration of the shader would be specified by the config.json file.
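A hypothetical config.json for the standard shader might look like this (the specific defines are illustrative; the uniform type codes reuse the 'c'/'f'/'t' conventions ThreeJS already uses):

{
    "defines": {
        "USE_NORMALMAP": { "type": "bool", "default": false },
        "USE_SPECULAR": { "type": "bool", "default": true },
        "MAX_DIR_LIGHTS": { "type": "int", "default": 1 }
    },
    "uniforms": {
        "diffuse": { "type": "c", "value": [ 1, 1, 1 ] },
        "opacity": { "type": "f", "value": 1.0 },
        "map": { "type": "t", "value": null }
    },
    "attributes": [ "position", "normal", "uv" ]
}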

So I am proposing: creating a library of functions (common.glsl) that are shared between shaders; getting rid of the chunk system in favor of flat files (with a *.glsl extension) and a preprocessor that can handle a simple "#include" system; and finally making one standard shader for most visible triangle surfaces in ThreeJS (combining Lambert, Phong, Normal and Basic).

I do think the result will be much more coder-friendly and accessible, and will encourage a lot more contribution on this front.

arose commented 10 years ago

I would like to share my experience writing custom shaders for ThreeJS using THREE.ShaderMaterial. To reuse as much as possible from ThreeJS, I wrote a simple function:

// Expand "#include chunk_name" directives using the built-in THREE.ShaderChunk map.
var getShader = function ( shaderStr ) {
    return shaderStr.replace( /#include\s+(\S+)/gi, function ( match, p1 ) {
        var chunk = THREE.ShaderChunk[ p1 ];
        return chunk ? chunk : "";
    } );
};

This allows me, for example, to conveniently use the ThreeJS fog:

var uniforms = THREE.UniformsUtils.merge( [
    THREE.UniformsLib[ "fog" ],
    { ... }
] );

var material = new THREE.ShaderMaterial( {
    uniforms: uniforms,
    vertexShader: getShader( ... ),
    fragmentShader: getShader( ... ),
    fog: true
} );

where the fragment shader looks like this:

uniform lowp vec3 colorx;

#include fog_pars_fragment

void main() {
    gl_FragColor = vec4( colorx,  1.0 );

    #include fog_fragment
}

For the fog this works quite well, but for e.g. lights/shading it is more complicated. I agree with @bhouston that it would help greatly if the inputs/outputs of a shader chunk were documented or encapsulated in GLSL functions.

bhouston commented 10 years ago

So I've started polishing up the shaders, using a common .glsl file along with one .glsl file per fragment/vertex shader. It can be a lot nicer than it is right now. For example, in the normalmap fragment shader there is this code:

    if( enableDiffuse ) {
#ifdef GAMMA_INPUT
        vec4 texelColor = texture2D( tDiffuse, vUv );
        texelColor.xyz *= texelColor.xyz;
        gl_FragColor = gl_FragColor * texelColor;
#else
        gl_FragColor = gl_FragColor * texture2D( tDiffuse, vUv );
#endif
    }
    if( enableAO ) {
#ifdef GAMMA_INPUT
        vec4 aoColor = texture2D( tAO, vUv );
        aoColor.xyz *= aoColor.xyz;
        gl_FragColor.xyz = gl_FragColor.xyz * aoColor.xyz;
#else
        gl_FragColor.xyz = gl_FragColor.xyz * texture2D( tAO, vUv ).xyz;
#endif
    }

I have converted it to this:

    if( enableDiffuse ) {
        gl_FragColor = gl_FragColor * inputGamma( texture2D( tDiffuse, vUv ) );
    }
    if( enableAO ) {
        gl_FragColor.xyz = gl_FragColor.xyz * inputGamma( texture2D( tAO, vUv ) ).xyz;
    }

Because it now uses this function, which encapsulates the GAMMA_INPUT define/option:

vec4 inputGamma( const in vec4 input_rgba ) {

#ifdef GAMMA_INPUT
    vec4 output_rgba = input_rgba;
    output_rgba.xyz *= output_rgba.xyz; // approximate sRGB-to-linear by squaring
    return output_rgba;
#else
    return input_rgba;
#endif

}

WestLangley commented 10 years ago

@bhouston +1 Can you show how, for example, a super-simple shader like 'normal' would be specified in your chunk-less, glsl-file approach?

bhouston commented 10 years ago

Well, I've started work on the normalmap shader. This is what I've refactored it to so far; I've reworked the input gamma and the lighting code.

Common.glsl: https://gist.github.com/bhouston/8303102
Normalmap.fragment.glsl: https://gist.github.com/bhouston/8303128

Now I am nowhere near done, there are likely new bugs I've introduced, and I think I am missing some of the common attributes (position, normal, etc.) that are inserted later by ThreeJS into the shader code. But this is what I have so far.

I believe the above is much easier to edit and refine than this:

https://github.com/mrdoob/three.js/blob/master/src/renderers/shaders/ShaderLib.js#L544

WestLangley commented 10 years ago

@bhouston wrote: "I believe the above is much easier to edit"

Agreed.

For historical reference, also see https://github.com/mrdoob/three.js/issues/2162 and https://github.com/mrdoob/three.js/issues/3768

bhouston commented 10 years ago

@WestLangley here is how I would combine all normal perturbation methods together in a universal shader:

    // calculate normal
    vec3 normal = normalize( vNormal );

#ifdef DOUBLE_SIDED
    normal = normal * ( -1.0 + 2.0 * float( gl_FrontFacing ) );
#endif

#ifdef USE_NORMALMAP_SIMPLE

    vec3 mapN = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;
    mapN.xy = normalScale * mapN.xy;
    normal = normalize( mapN * perturbNormal2Arb( -vViewPosition, normal ) );

#elif defined( USE_NORMALMAP_BINORMAL )

    vec3 normalTex = rgbaToNormal( texture2D( tNormal, vUv ), normalScale );
    mat3 tsb = mat3( normalize( vTangent ), normalize( vBinormal ), normal );
    normal = tsb * normalTex;

#elif defined( USE_BUMPMAP )

    normal = perturbNormalArb( -vViewPosition, normal, dHdxy_fwd( bumpMap, bumpMapScale, vUv ) );

#endif

I combined all the options from Phong with those in Normalmap. Of course I haven't tested it, so there may be subtle bugs. But I find the above fairly understandable, even though there are four options (simple normals, normal map, normal map with tangents/binormals, and bump map).

WestLangley commented 10 years ago

So far, so good.

Not to derail your thread, but an upcoming step is to figure out a good way to extend, or modify, MeshPhongMaterial. Currently, that must be done by creating a ShaderMaterial using the "chunk-system", then figuring out which properties must be assigned to uniforms and which properties can remain as ShaderMaterial properties. Then, since WebGLRenderer sets the DEFINES for MeshPhongMaterial as a special case -- and does not do so for ShaderMaterial -- the user must manually set the necessary DEFINES for the ShaderMaterial "chunks" that the renderer does not set.

So a simple test would be to extend (or modify) MeshPhongMaterial by adding a decal map, for example, and see how it goes.
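For reference, a hedged sketch of that workaround as it stands today (the decal shader sources and decalTexture are assumed to exist elsewhere; the uniform merging and the manual defines are the point):

// Sketch: extend phong via ShaderMaterial; 'decalMap' is hypothetical.
var uniforms = THREE.UniformsUtils.merge( [
    THREE.ShaderLib[ 'phong' ].uniforms,
    { decalMap: { type: 't', value: null } }
] );

var material = new THREE.ShaderMaterial( {

    uniforms: uniforms,

    // These must be hand-assembled from the phong "chunks", with the
    // decal sampling spliced in at the right place.
    vertexShader: document.getElementById( 'decal-vertex' ).textContent,
    fragmentShader: document.getElementById( 'decal-fragment' ).textContent,

    lights: true,
    fog: true,

    // Defines that WebGLRenderer would have set automatically for
    // MeshPhongMaterial must now be set by hand.
    defines: { USE_MAP: '' }

} );

material.uniforms.decalMap.value = decalTexture;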

bhouston commented 10 years ago

@WestLangley I would move all setting of defines outside of the renderer itself. The shader library would be introspectable for defines and attributes in addition to the uniforms (which are right now the only introspectable parameters on shaders). We could even specify the type of a define as bool or integer (are there other types in use?). I hinted at this when I said that the available defines would be specified in the config.json files beside each shader.

safetydank commented 10 years ago

I like the idea of making the shader library more modular and am keen to see where this goes. As an aside, @unconed wrote a nice shader graph prototype for assembling shaders from fragments; I wonder if some of its ideas are applicable here?

https://github.com/unconed/ShaderGraph.js

mrdoob commented 10 years ago

As in #4221, I'm all for this, but maybe it would be better to implement it in WebGLRenderer3 instead.

WestLangley commented 10 years ago

@bhouston Just for clarity, my understanding is that your intention is to combine the Phong, Lambert, and Basic shaders into one, but that MeshBasicMaterial, MeshLambertMaterial, and MeshPhongMaterial will, from an API standpoint, remain as-is. Is that correct?

(I hope so.)

brianxu commented 9 years ago

Hi guys, I think this might be the proper thread to add my thoughts about the Material design. I have also found it very difficult to extend an existing material, basically for the reasons @WestLangley described. I think better object-oriented design might help here. It might be hard, but if we had a unified route to set and update materials, with each material implementing methods such as refreshUniformsXXX(), the code inside WebGLRenderer would be much cleaner. Right now this is all done through the use of instanceof with quite a few if/else statements. Ideally, WebGLRenderer shouldn't care what material it is; it should just be able to get the shaders from the material and update the uniforms by calling methods implemented on the material. I think that would also make customization easier. Does that make sense?
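For illustration, a minimal sketch of that inversion (MyCustomMaterial and the refreshUniforms hook are hypothetical, not existing ThreeJS API):

// Hypothetical sketch: the material owns its shaders and uniform updates,
// so the renderer never needs instanceof checks.
function MyCustomMaterial() {

    THREE.Material.call( this );

    this.color = new THREE.Color( 0xffffff );

    this.vertexShader = [
        'void main() {',
        '    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );',
        '}'
    ].join( '\n' );

    this.fragmentShader = [
        'uniform vec3 diffuse;',
        'uniform float opacity;',
        'void main() {',
        '    gl_FragColor = vec4( diffuse, opacity );',
        '}'
    ].join( '\n' );

}

MyCustomMaterial.prototype = Object.create( THREE.Material.prototype );

// The renderer would call this generically each frame instead of
// branching on the material's concrete type.
MyCustomMaterial.prototype.refreshUniforms = function ( uniforms ) {

    uniforms.diffuse.value.copy( this.color );
    uniforms.opacity.value = this.opacity;

};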

mrdoob commented 9 years ago

@brianxu What happens when using a renderer other than WebGLRenderer?

brianxu commented 9 years ago

@mrdoob I think you mean CanvasRenderer. I'm not really familiar with it, but I think a similar idea could be applied. It could also be left as it is for now, couldn't it?

bhouston commented 9 years ago

@mrdoob I think the future is that all but the WebGL-based renderers are deprecated. You are already heading there with CanvasRenderer in examples/js. ;) Maybe leave all existing materials as-is, and create a new WebGLMaterial that can be extended in this fashion.

@brianxu I like the idea. Can you prototype it?

brianxu commented 9 years ago

@bhouston Thanks, I do want to prototype it, but I also want to make sure it doesn't conflict with anything on the THREE roadmap. So I would like to hear more comments about it first.

mrdoob commented 9 years ago

Maybe leave all existing Materials as is, and create a new WebGLMaterial that can be extended in this fashion.

That sounds better. But you know, the main reason we even have WebGLRenderer is that the original design of the library was renderer-agnostic. Who knows? Maybe Microsoft ports Direct3D to all platforms and makes it open source and 100x faster than OpenGL in 2015? ;)

wilson0x4d commented 9 years ago

OT: ...or 2016, 2017... 2020. After you've been coding for 20+ years, you stop putting 'time constraints' on agnostic design patterns/decisions, because on a long enough timeline everything changes.

brianxu commented 9 years ago

Qt has been doing this for years: building a whole set of libraries for different platforms. I suppose the question is more about whether this feature is useful and will enhance the development experience.

dmtaub commented 9 years ago

It seems safe to assume that as we create different renderers in the future, we will also need to either create new materials or extend the existing materials' support for the available renderers.

As it is, it is fairly difficult to create a new material or subclass an existing one, because the renderer needs to be changed as well.


bhouston commented 9 years ago

I think there seems to be a lot of change if you look at it from an ActionScript or JavaScript perspective -- Canvas, CSS3, and WebGL have all arrived relatively recently. But if you look at it from the side of hardware-accelerated rendering, change happens much more slowly. The GL/OpenGL/OpenGL ES/WebGL API has been around since 1992. And it has only really had one big update -- the addition of vertex buffers and programmable shaders, which occurred between 2001 and 2004. Everything else has been pretty minor.

The future is probably another revamp of the OpenGL standard over the next couple of years, in the direction of Apple's Metal, AMD's Mantle, and DirectX 12; see https://en.wikipedia.org/wiki/OpenGL#OpenGL_NG All three of these keep a GLSL-like language but change the initialization semantics. WebGL is a lagging standard compared to OpenGL, so I would guess we are probably 3 to 5 years away from OpenGL NG.

Unreal Engine handles multiple targets (iOS, Android, PS4, Xbox One, Windows) while keeping programmable shaders by having shaders written in Unreal Shader Format (*.usf), which can be translated into any of these targets (DirectX, OpenGL, etc.).

Programmable shaders are the future of all GPU-accelerated 3D, and standardizing on them is likely a decent idea.

brianxu commented 9 years ago

Thanks for the input, guys; I'm going to prototype it and post back with an update soon.

brianxu commented 9 years ago

Here is my initial prototype: https://github.com/mrdoob/three.js/pull/5812

bhouston commented 7 years ago

We have a StandardMaterial now. :)