Closed: sunag closed this issue 2 years ago.
Interesting!
@sunag We are very interested in something like this. How can I help? I can add support for Standard at least.
I am interested in creating fairly arbitrary graphs, so that the intermediate nodes also take inputs. So you can have a graph that looks like this:
A = Texture(tex1, uv1)
B = Texture(tex2, uv2)
C = Blend(A, B, mode)
D = Noise(param1, param2)
E = Blend(C, D, mode)
And then use that final node E, as an input to a Material.
So it is very arbitrary, not limited to just textures.
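The graph above can be sketched in plain JavaScript. This is only an illustration of the idea, not the actual NodeMaterial API; every class name here (TextureNode, BlendNode, etc.) is hypothetical, and each node simply generates a GLSL expression from the expressions of its inputs:

```javascript
// Hypothetical sketch: each node generates a GLSL expression from its inputs.
// None of these class names come from the actual three.js code.

class UVNode {
	constructor( name ) { this.name = name; }
	generate() { return this.name; }
}

class TextureNode {
	constructor( sampler, uv ) { this.sampler = sampler; this.uv = uv; }
	generate() { return 'texture2D( ' + this.sampler + ', ' + this.uv.generate() + ' )'; }
}

class BlendNode {
	constructor( a, b, mode ) { this.a = a; this.b = b; this.mode = mode; }
	generate() {
		const a = this.a.generate(), b = this.b.generate();
		if ( this.mode === 'multiply' ) return '( ' + a + ' * ' + b + ' )';
		return '( ' + a + ' + ' + b + ' )'; // fall back to additive blending
	}
}

class NoiseNode {
	constructor( scale, uv ) { this.scale = scale; this.uv = uv; }
	generate() { return 'noise( ' + this.uv.generate() + ' * ' + this.scale.toFixed( 1 ) + ' )'; }
}

// A Texture(tex1, uv1), B Texture(tex2, uv2), C Blend(A, B), D Noise, E Blend(C, D)
const A = new TextureNode( 'tex1', new UVNode( 'vUv' ) );
const B = new TextureNode( 'tex2', new UVNode( 'vUv2' ) );
const C = new BlendNode( A, B, 'multiply' );
const D = new NoiseNode( 4.0, new UVNode( 'vUv' ) );
const E = new BlendNode( C, D, 'add' );
```

Feeding `E` to a material then just means using `E.generate()` as the right-hand side of the material's color assignment.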
My goal would be to help create this in the next few weeks, hopefully collaborating with you in the next week or so if possible. I was looking at doing this via the shadergraph library here: https://github.com/mrdoob/three.js/issues/7339 But I am fine with creating it within ThreeJS directly. Really, any solution is good as long as it is flexible and it works.
I'm reading through your code; it is quite nicely designed. I have some initial feedback. Could I start a PR using your code as a base to start collaborating on it?
(1) I'd have the material resolve the references. Basically, references would have names that they would ask their material to resolve, and the material would give back a snippet of code describing how to access that data. This would also let the material know which variables (uniforms/varyings) are used by the nodes, so it can optimize appropriately. It also allows different materials to resolve references differently, making the nodes more portable, rather than having to know how the materials implement things, especially when there are differences between the fragment and vertex implementations.
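A minimal sketch of this resolution idea, under my own assumptions (the class names, reference names, and snippets below are all illustrative, not from the actual code): nodes hold a name, the material maps the name to a code snippet, and the material records which references were actually used so it can declare only the uniforms/varyings it needs.

```javascript
// Hypothetical sketch of point (1): nodes ask the material to resolve named
// references; the material returns a snippet and records what was used.

class SketchMaterial {
	constructor() {
		// reference name -> GLSL snippet (fragment stage); names are illustrative
		this.table = {
			uv: 'vUv',
			normal: 'normalize( vNormal )',
			viewDir: 'normalize( vViewPosition )'
		};
		this.used = new Set(); // lets the material declare only needed varyings
	}
	resolve( name ) {
		if ( ! ( name in this.table ) ) throw new Error( 'unknown reference: ' + name );
		this.used.add( name );
		return this.table[ name ];
	}
}

class ReferenceNode {
	constructor( name ) { this.name = name; }
	generate( material ) { return material.resolve( this.name ); }
}

const material = new SketchMaterial();
const code = new ReferenceNode( 'normal' ).generate( material );
// material.used now tells the material that only the normal varying is required
```

A different material could map the same reference names to different snippets, which is what makes the nodes portable between materials.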
(2) I'd try to use the GeometryContext object I created in the lights refactor; it gives consistent access to a lot of the required local variables. But of course that can be resolved by the material itself.
(3) I'd have UV just be another reference, which is resolved by the material. And I would have NodeTexture actually take a node input, thus allowing for procedurally generated UVs.
(4) I would call NodeCube NodeTextureCube, to be consistent with the rest of Three.JS, and I would remove the logic for how to actually do the lookups from it. I like the idea of standard cube maps, so I would not put the environment query for specular or diffuse in the nodes, but have that in the base Phong material; you would only control the normal used for the query, or the cubemap result itself (thus allowing it to be a procedurally determined color). Does that make sense? So I would have the normal pluggable in the material and the cube texture pluggable (queryable by a direction and bias/LOD, returning a color). Thus one can provide one cube texture to the irradiance map and another to the specular map. We can swap out a true sampleCube with @tschw's cubeToUV2 function as just a node swap.
(5) I'd try to add a NodeFunction that allows one to call arbitrary functions with parameters as an addition to your NodeOp (or maybe they could be merged in some fashion.)
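A sketch of what such a function-call node could look like (FunctionNode and FloatNode here are hypothetical names, not the real API): it just emits a GLSL call with its arguments' generated expressions.

```javascript
// Hypothetical FunctionNode sketch for point (5): call an arbitrary GLSL
// function with node parameters.

class FloatNode {
	constructor( v ) { this.v = v; }
	generate() { return this.v.toFixed( 2 ); }
}

class FunctionNode {
	constructor( name, args ) { this.name = name; this.args = args; }
	generate() {
		const params = this.args.map( ( a ) => a.generate() ).join( ', ' );
		return this.name + '( ' + params + ' )';
	}
}

// mix( a, b, t ) built as a function call instead of an operator chain
const call = new FunctionNode( 'mix', [ new FloatNode( 0 ), new FloatNode( 1 ), new FloatNode( 0.5 ) ] );
```

A blend node could then be one FunctionNode per blend mode rather than a growing switch statement.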
(6) I'd get rid of all of the verbose NodeNormal, NodeTransform, NormalMap, etc. individual classes and just have some simple constructors that create a NodeReference with a name that is resolved by the material as appropriate. NodeReference could resolve uniforms and varyings as well as computed values in the shader.
(7) I do not understand the difference between NodeEnvironment and NodeCube. I think NodeEnvironment may be incomplete?
(8) It is confusing to have NodePhong not be derived from NodeMaterial, although I see that NodeMaterial is derived from ShaderMaterial. I wonder if calling the direct derivative of ShaderMaterial GraphMaterial (or NodeGraphMaterial) would make more sense -- because all together the nodes form a graph, and it is the graph that becomes the material, not an individual node.
(9) I would suggest maybe some more varied terminology. I'd call the root node MaterialNode, and one could derive PhongMaterialNode from it. I'd have Vector3Node, FloatNode, etc. derived from ValueNode -- not necessarily constant, just a value. Thus one could pipe three FloatNodes into a Vector3Node. I think you could have a helper that would make declaring each of these a line or so rather than the ten or so currently.
(10) I would move the name "Node" from the start of the class names to the back because that is how it is in the rest of the ThreeJS project.
(11) I would create the new MaterialNode class and have it initialized with a list of uniforms and varyings. By default it would be able to resolve these, and it would also track which it has resolved so one can tell which features are needed. One could thus have a limited resolve in the derived PhongMaterialNode that handles the special cases and relies on the underlying class for the simple ones (varyings, uniforms).
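That layering could look roughly like this (again a hypothetical sketch, with invented names and snippets): the derived class handles its special cases and falls back to the base resolver, and the base class tracks what was resolved.

```javascript
// Sketch of point (11): a base material resolves simple uniforms/varyings and
// tracks usage; a derived material handles special cases and falls back.
// All names and snippets are hypothetical.

class BaseMaterialNode {
	constructor( uniforms, varyings ) {
		this.uniforms = uniforms;
		this.varyings = varyings;
		this.resolved = new Set(); // which features the final shader needs
	}
	resolve( name ) {
		if ( this.uniforms.includes( name ) || this.varyings.includes( name ) ) {
			this.resolved.add( name );
			return name; // simple case: emit the variable itself
		}
		return null;
	}
}

class PhongMaterialNode extends BaseMaterialNode {
	resolve( name ) {
		if ( name === 'specularColor' ) {
			this.resolved.add( name );
			return 'specular * specularStrength'; // special-case snippet
		}
		return super.resolve( name ); // fall back to the simple resolver
	}
}

const phong = new PhongMaterialNode( [ 'diffuse', 'opacity' ], [ 'vUv' ] );
```

After generation, `phong.resolved` is the feature list the material can use to strip unused uniforms and varyings from the shader.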
(12) I am sort of confused about the difference between NodePhong and NodePhongMaterial. I didn't realize there were both until now.
(13) There is code like this:

THREE.NodeGLPosition.prototype = Object.create( THREE.Node.prototype );
THREE.NodeGLPosition.prototype.constructor = THREE.NodeGLPosition;

THREE.NodeGL.prototype.generate = function ( material, shader ) {

But above this snippet you already defined generate for NodeGL, and you didn't define one for NodeGLPosition -- thus I think it is a copy-paste-edit error.
(14) I would get rid of NodeReflectUVW and NodeRefractVector and instead make this something one can request from the material via a reference resolve. Calculating a reflection vector is straightforward; I have added it to GeometryContext in my unmerged experimental ThreeJS branches.
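For reference, the reflection calculation mentioned here is just the standard formula R = D - 2(D.N)N, which GLSL already provides as the built-in reflect( d, n ). A quick plain-JS illustration of the math a resolved Reflect reference would stand for:

```javascript
// Reflect direction d about unit normal n: r = d - 2 (d . n) n.
// ( GLSL has this built in as reflect( d, n ); this is just the math. )

function reflect( d, n ) {
	const dot = d[ 0 ] * n[ 0 ] + d[ 1 ] * n[ 1 ] + d[ 2 ] * n[ 2 ];
	return [
		d[ 0 ] - 2 * dot * n[ 0 ],
		d[ 1 ] - 2 * dot * n[ 1 ],
		d[ 2 ] - 2 * dot * n[ 2 ]
	];
}

// a ray going straight down bounces straight up off a floor facing +Y
const r = reflect( [ 0, -1, 0 ], [ 0, 1, 0 ] );
```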
(15) The way I would implement reflection and refraction would be to have them as color inputs on the material. One would resolve Reflect, ReflectLOD, and Refract, RefractLOD in the simple way you would resolve any variable, then pass them into one's cube texture equivalent (procedural or samplerCube-based), and then pass the resulting color into the material. Is that how you were doing it?
(16) I'm confused about the light input -- usually one doesn't have lights being pluggable; rather, the light parameters are fully defined in the light class. I guess you need this additional flexibility? How do you envision it?
@bhouston wow, thank you very much for the feedback. I will need several posts to answer :)
I am interested in creating fairly arbitrary graphs, so that the intermediate nodes also take inputs. So you can have a graph that looks like this: A = Texture(tex1, uv1), B = Texture(tex2, uv2), C = Blend(A, B, mode), D = Noise(param1, param2), E = Blend(C, D, mode)
Currently the syntax is like this -- a UV offset animation example. I also think that NodeMaterial to MaterialNode, and THREE.PhongMaterialNode, would be better.
var uv2 = false; // use the first UV channel
var uv_offset = new THREE.NodeFloat( 0 );

// offset the UVs by a scalar, animated per frame
var uv = new THREE.NodeOperator( '+', new THREE.NodeUV( uv2 ), uv_offset );
var texture = new THREE.NodeTexture( imgTexture, uv );

nodematerial.color = texture;

// onUpdate
uv_offset.number += .01;
I think reversing the order as you suggest is better: (mode, A, B) becomes (A, B, mode). I am in the process of creating the reflection maps, cubemaps and others...
The environment and Cubemap are incomplete.
Currently, bugs are more likely to happen because the format converter is still unfinished. It is responsible for vector conversion, e.g. vec3 to vec4 or the reverse.
https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/index.html#L365
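The converter's job can be sketched as a small function that adapts a GLSL expression from one type to another. The conversion rules below are my assumption of how such a converter typically works, not the actual implementation linked above:

```javascript
// Hypothetical sketch of a format converter: adapt a GLSL expression from one
// vector type to another (e.g. vec3 -> vec4).

function formatConvert( code, fromType, toType ) {
	if ( fromType === toType ) return code;
	if ( fromType === 'vec3' && toType === 'vec4' ) return 'vec4( ' + code + ', 1.0 )';
	if ( fromType === 'vec4' && toType === 'vec3' ) return '( ' + code + ' ).xyz';
	if ( fromType === 'float' && toType === 'vec3' ) return 'vec3( ' + code + ' )';
	throw new Error( 'no conversion from ' + fromType + ' to ' + toType );
}
```

Each node's `format()` call would route through something like this so that any node output can feed any input slot.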
A "blend" texture, for example (I have not tested this code); it can be implemented in the same way as a THREE.NodeOperator:
https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/index.html#L1105
THREE.NodeBlend = function ( a, b, mode ) {

	THREE.NodeInput.call( this, 'blend' );

	this.mode = mode;

	this.a = a;
	this.b = b;

};

THREE.NodeBlend.prototype = Object.create( THREE.NodeInput.prototype );
THREE.NodeBlend.prototype.constructor = THREE.NodeBlend;

THREE.NodeBlend.prototype.generate = function ( material, shader, output ) {

	var a = this.a.build( material, shader, output );
	var b = this.b.build( material, shader, output );

	switch ( this.mode ) {

		case 'multiply':
			return this.format( '(' + a + '*' + b + ')', this.a.type, output );

	}

	// unknown mode: pass through the first input
	return a;

};
.generate() is responsible for the code generation. Generated code is stored in a cache, so a node can be used in more than one input without losing performance.
I still have not set up pointers or constants for optimization...
The compilation is done by propagation through build() for the vertex and fragment code.
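The caching idea described here can be sketched as follows (hypothetical structure, not the actual implementation): the first build() emits a temporary variable into the shader, and every later build() of the same node just returns that variable's name.

```javascript
// Sketch: generate once per node; reuse the cached variable name afterwards,
// so a node shared by several inputs is not re-evaluated in the shader.

let varId = 0;

class CachedNode {
	constructor( expr ) {
		this.expr = expr;
		this.cacheName = null;
		this.generated = 0; // how many times real code generation ran
	}
	build( shaderLines ) {
		if ( this.cacheName === null ) {
			this.generated ++;
			this.cacheName = 'node' + ( varId ++ );
			shaderLines.push( 'vec3 ' + this.cacheName + ' = ' + this.expr + ';' );
		}
		return this.cacheName; // later inputs reuse the temporary variable
	}
}

const lines = [];
const shared = new CachedNode( 'texture2D( map, vUv ).rgb' );
const first = shared.build( lines );
const second = shared.build( lines ); // cache hit: no extra shader code
```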
I can put you as a collaborator? If you want to edit the code in any way, I will be working on it as well.
I can put you as a collaborator? If you want to edit the code in any way, I will be working on it as well.
Thanks! I'll make PRs to yours so you can approve the changes.
I've added you (as well as @mrdoob, @WestLangley and @tschw) to a side project of mine that is attempting to define a set of reusable nodes and material definitions that can be transferable between various renderers. It is mappable onto this shader graph system you've created.
I do not think you have to pay attention to the repo I just gave you access to if you do not want to. It is what I am interested in implementing on top of this.
(2) I'd try to use the GeometryContext object I created in the lights refactor; it gives consistent access to a lot of the required local variables. But of course that can be resolved by the material itself.
I wish the lights were a LightNode. My concern is to reuse the code already developed for Three.JS.
(3) I'd have UV just be another reference, which is resolved by the material. And I would have NodeTexture actually take a node input, thus allowing for procedurally generated UVs.
You mean being able to replace the UV with any vec2 -- would it be this?
(5) I'd try to add a NodeFunction that allows one to call arbitrary functions with parameters as an addition to your NodeOp (or maybe they could be merged in some fashion.)
This would be great, mainly for a BlendNode.
(6) I'd get rid of all of the verbose NodeNormal, NodeTransform, NormalMap, etc. individual classes and just have some simple constructors that create a NodeReference with a name that is resolved by the material as appropriate. NodeReference could resolve uniforms and varyings as well as computed values in the shader.
In this line of thought, I think MaterialNode could be the base of the Phong and Physical materials.
(7) I do not understand the difference between NodeEnvironment and NodeCube. I think NodeEnvironment may be incomplete?
I have not been able to finish these nodes yet.
(8) It is confusing to have NodePhong not be derived from NodeMaterial, although I see that NodeMaterial is derived from ShaderMaterial. I wonder if calling the direct derivative of ShaderMaterial GraphMaterial (or NodeGraphMaterial) would make more sense -- because all together the nodes form a graph, and it is the graph that becomes the material, not an individual node.
NodeMaterial would be the root node material; it is necessary to use a node for vertex and fragment. NodePhong is hybrid, and NodePhongMaterial is only a proxy class. These can then be merged.
(9) I would suggest maybe some more varied terminology. I'd call the root node MaterialNode, and one could derive PhongMaterialNode from it. I'd have Vector3Node, FloatNode, etc. derived from ValueNode -- not necessarily constant, just a value. Thus one could pipe three FloatNodes into a Vector3Node. I think you could have a helper that would make declaring each of these a line or so rather than the ten or so currently.
Sounds good.
(16) I'm confused about the light input -- usually one doesn't have lights being pluggable; rather, the light parameters are fully defined in the light class. I guess you need this additional flexibility? How do you envision it?
This would be for the lightmap or a possible LightNode.
This would be for the lightmap or a possible LightNode.
I like the idea of a pluggable lightmap because one could define the UVs for it explicitly. :)
@bhouston I made several corrections today in this file, but there is still a lot to do: https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/three.node.js
This is the playground that I am creating :art: Textures and buttons are drag and drop; it works in Chrome only.
Amazing stuff! Holy crap! It is beautiful.
Would it be possible to share the code in a way that I can also contribute? As a public PR or something?
You are working with the Sea3D project here right?
https://github.com/sunag/sea3d/tree/gh-pages/Labs/Three.JS-NodeMaterial
So I can just fork it and start contributing? Would you accept PRs? How can we effectively collaborate?
I haven't asked but @mrdoob probably (?) would love to have this within the ThreeJS project itself.
Definitely!
So I can just fork it and start contributing? Would you accept PRs? How can we effectively collaborate?
Of course, I think your help would be amazing. I also have to bring in other node types, like saturation and noise, as you suggested.
You are working with the Sea3D project here right?
I am thinking of making a PR to Three.JS with examples, so all of this gets defined.
@mrdoob What do you think about the material names: THREE.MaterialNode or THREE.NodeMaterial?
It's a type of material, so it should be THREE.NodeMaterial.
Up. I think I'm close to a PR? https://github.com/sunag/sea3d/commit/55cf70aea38b2b72110abf96f7b96924dad81c54
Drag this cubemap into the playground, along with any other texture, for tests: https://raw.githubusercontent.com/mrdoob/three.js/master/examples/textures/skyboxsun25degtest.png
A rim shader example:
An area reflection example:
This is so awesome @sunag!
@bhouston thanks! Do you think it will be difficult to convert to r74?
This is very impressive work! Reminds me of shaderforge.
Very good job so far!
@sunag It will be a bit of work, but I would like to help, and most of the big structural changes in the r74 shader code are my fault. :)
@GGAlanSmithee it is also very similar to what UE4 has: https://docs.unrealengine.com/latest/INT/Engine/Rendering/Materials/ExpressionReference/index.html
Looks beautiful, man, and fun to play with. Might I offer Blender3D's node editor for inspiration? I find it super efficient; there are even some great videos on PBR via Blender's node system and which nodes would be most useful to create for building PBR from scratch:
https://www.youtube.com/playlist?list=PLlH00768JwqG4__RRtKACofTztc0Owys8
It will be a bit of work, but I would like to help, and most of the big structural changes in the r74 shader code are my fault. :)
@bhouston Wow. It is much cleaner with the new r74 changes. I finished the first part; StandardMaterial is still missing. If there is any problem I will post here :) Thanks https://github.com/sunag/sea3d/commit/d544ad7993272348f8bbea2337cdceb52159a6a8
Another thing we are still missing is refraction. I would really like to use the render buffer in place of a cubemap RTT by default. It would be much more efficient.
@GGAlanSmithee I have some references from ShaderFX, of which I am a big fan. Shader Forge and UE4 are also great references. Thanks!
@richardanaya Amazing videos. Thanks!
@mrdoob Which folder do you recommend putting these files in? The three.js root ( src/materials/node ) or examples? https://github.com/sunag/sea3d/tree/gh-pages/Labs/Three.JS-NodeMaterial/node
I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial". Both of these names make it clear that the material contains nodes, rather than being a Node itself.
I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial". Both of these names make it clear that the material contains nodes, rather than being a Node itself.
For me both look good. I leave the decision to @mrdoob what do you think?
BTW @sunag I'd stick with cubemaps for refractions if possible, it is easier and more accurate. I think that is how nearly everyone else does it and we need the RTT stuff for accurate reflections as well. I think it just needs to be a fast render at 128^2 or 256^2.
BTW @sunag I'd stick with cubemaps for refractions if possible, it is easier and more accurate. I think that is how nearly everyone else does it and we need the RTT stuff for accurate reflections as well. I think it just needs to be a fast render at 128^2 or 256^2.
Yes, we can allow both; it would be a trade-off between performance and accuracy. Still, for planar refraction (glass, water) I recommend the buffer in place of a cubemap in most cases.
To test NodeStandard it is necessary to drag in the cubemap. https://raw.githubusercontent.com/mrdoob/three.js/master/examples/textures/skyboxsun25degtest.png http://sea3d.poonya.com/flow/
Source: https://github.com/sunag/sea3d/blob/gh-pages/Labs/Three.JS-NodeMaterial/node/NodeStandard.js
@mrdoob Which folder do you recommend putting these files in? The three.js root ( src/materials/node ) or examples?
I would put it in examples to start with. Once it gets well defined we can later move it to src 😊
I'd still recommend calling this a "Material Graph" or in inverted ThreeJS style, a "GraphMaterial." Or if you insist on using the term "Node", I'd call it "NodeBasedMaterial".
For me both look good. I leave the decision to @mrdoob what do you think?
I kind of like NodeMaterial already... Maybe NodesMaterial? NodeGraphMaterial? @WestLangley any suggestions?
I would rename the current nodes though... NodeColor, NodeFloat, NodeTexture, ... to ColorNode, FloatNode, TextureNode.
I can't get this to load 😐
I would put it in examples to start with. Once it gets well defined we can later move it to src :blush:
@mrdoob That will be great.
I would rename the current nodes though... NodeColor, NodeFloat, NodeTexture, ... to ColorNode, FloatNode, TextureNode
I will go ahead with that then.
That was the local URL :blush: , try this: http://sea3d.poonya.com/flow/
@WestLangley any suggestions?
I suggest THREE.FlowMaterial. My second choice would be THREE.CustomMaterial.
As a completely random bystander. NodeMaterial sounds very intuitive to me, because that's what they are called in Blender3D
If @mrdoob likes NodeMaterial, we can stick with it. :)
NodeMaterial it is then 😁
This project is awesome! Just saw the demo on Twitter and it's very impressive.
I wonder if the nodes should be something that goes into the Three core. They're very implementation-specific. I too am building a Three.js shader graph editor (not yet released) for ShaderFrog.com, and the solution I have is to just export the GLSL code and all the needed metadata, like uniform names, into a little JSON file, and load it with an external runtime library.
This graph editor can work with full shaders by analyzing their source code, meaning no specific shader node types are required. Could this NodeMaterial type be handled entirely outside of Three's core as well? All you really have to output is a RawShaderMaterial for someone to use in their own project.
Could this NodeMaterial type be handled entirely outside of Three's core as well? All you really have to output is a RawShaderMaterial for someone to use it in their own project.
@DelvarWorld Hi. Yes, in theory, but it is still too early to make a good shader from raw output. For now it is better to have an initial interface. It also helps to keep compatibility with skinning/morphing and other native components of Three.JS.
I thought I'd note a minor issue: in "Flow", the connectors don't pop to the top level when dragging nodes over each other. Maybe this issue is waiting on layers or something.
I wonder if there is a way to unify both approaches? I am interested in a multi-layered shader and for that you need to have multiple BSDFs that contribute towards a final result. This means that one needs to separate out the shading model more -- right now it is pretty tightly coupled in @sunag's current design. I think we should head in a direction where one doesn't need to have a Standard or Phong material specified, it could be raw like what @DelvarWorld has. I think we can move there incrementally though, so what @sunag has is a good start.
This is not a fleshed out answer, but a while ago I created a rough proposal for a portable shader format that includes metadata, such as uniform names, their types, their type in Three.js, etc. https://github.com/DelvarWorld/ShaderFrog-Runtime/blob/master/THREE_SHADER_FORMAT.md
All a shader really needs to run in a real environment is the raw shader source code (since the GPU compiles it) and, for convenience, what uniforms the user can set with what values. This rough proposal does not include any notion of building a shader programmatically; it's just a simple delivery format for GLSL and metadata. It's compatible with Three.js because you just have to put it into a RawShaderMaterial in the end.
Currently there is no way to make shaders portable in Three.js or export them from any application, which is what gave me the idea to propose a standard, and I think it could solve both of our problems, since we're building external applications that in the end spit out a predetermined material. It also means the implementation details of compiling specific combinations of a graph are left up to applications, not Three.js.
Does this proposed shader standard have a place in Three? I have no idea. Right now it's probably more useful for me than for Three's core, since Three builds its own shaders its own way.
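To make the idea concrete, a descriptor along the lines described above might look like this. This is purely illustrative: the field names and GLSL bodies are my invention, and the real schema is whatever THREE_SHADER_FORMAT.md specifies.

```javascript
// Illustrative portable-shader descriptor: raw GLSL plus uniform metadata.
// Field names are hypothetical, not taken from the actual proposal.

const shaderDescriptor = {
	name: 'SimpleTint',
	vertex: '/* vertex shader source, attribute and matrix declarations elided */',
	fragment: 'uniform vec3 tint; void main() { gl_FragColor = vec4( tint, 1.0 ); }',
	uniforms: {
		tint: { type: 'v3', description: 'output color', default: [ 1, 0, 0 ] }
	}
};

// A consumer would hand the two sources and the uniforms to a
// RawShaderMaterial-style constructor in its own engine.
```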
@DelvarWorld I did start this project on a way to create a standardized set of shader graph nodes:
https://github.com/OpenMaterialGraph/OpenMaterialGraph
Node specifications here:
https://github.com/OpenMaterialGraph/OpenMaterialGraph/tree/master/spec/nodes
Minimalist BSDFs specifications here:
https://github.com/OpenMaterialGraph/OpenMaterialGraph/tree/master/spec/bsdfs
This is oriented towards Physically-based rendering though.
My feeling is that one needs to have a shell of a shader for ThreeJS that is higher level than a raw shader but lower level than Phong shader. Basically the default shader would be able to do morph targets, bones, etc. And then all these specific lighting scheme shaders (Basic, Lambert, Phong, Standard) would use that template. Right now there is an implied template shader (we include the same things in each shader) -- but I think we could make it clearer where to plug in things. You could plug in lighting schemes (phong, lambert, basic, or multilayered) and you can plug in properties to those lighting schemes which are your general nodes.
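The "shell shader" idea above can be sketched as simple template composition: one template carries the shared boilerplate (morph targets, bones, fog), and the lighting scheme is plugged into a marked slot. This is a hypothetical mechanism; today three.js composes shaders from #include chunks instead, and the slot names below are invented.

```javascript
// Sketch: a template shader with a pluggable lighting-scheme slot.

const fragmentTemplate = [
	'// common declarations, fog, tonemapping ... (elided)',
	'void main() {',
	'	%LIGHTING_SCHEME%',
	'	gl_FragColor = vec4( outgoingLight, diffuseColor.a );',
	'}'
].join( '\n' );

function composeShader( template, slots ) {
	// replace each %NAME% marker with the snippet plugged into that slot
	return template.replace( /%([A-Z_]+)%/g, ( match, name ) => {
		if ( ! ( name in slots ) ) throw new Error( 'unfilled slot: ' + name );
		return slots[ name ];
	} );
}

// plug a Phong-style scheme into the shared shell
const phongFragment = composeShader( fragmentTemplate, {
	LIGHTING_SCHEME: 'vec3 outgoingLight = phongShading( diffuseColor.rgb );'
} );
```

Basic, Lambert, Phong, Standard, or a multi-layered scheme would then differ only in what gets plugged into the slot, not in the surrounding shell.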
Hi.
I started development of a THREE.NodeMaterial to reconcile the material differences between 3D authoring tools. SEA3D Studio has options to create layers in albedo with masks and various blend modes, rim shaders and others, without needing custom shader code. I would like to bring this to Three.JS with a node shader.
I think that MeshPhongMaterial, MeshPhysicalMaterial and others can easily be based on NodeMaterial, through an interface for backward compatibility, or as proxies only.
UPDATED http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_materials_nodes.html http://sunag.github.io/sea3d/Labs/Three.JS-NodeMaterial/webgl_postprocessing_nodes.html
Syntax example using UV1 or UV2 for a texture:
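Based on the NodeUV( uv2 ) flag seen in the earlier snippet, the channel selection presumably works along these lines. This standalone sketch only shows which varying each choice would map to; the class body and varying names are my assumption, not the actual implementation:

```javascript
// Hypothetical sketch: NodeUV takes a flag selecting the first or second UV
// channel and resolves to the corresponding varying.

class NodeUV {
	constructor( useUv2 ) { this.useUv2 = useUv2 === true; }
	generate() { return this.useUv2 ? 'vUv2' : 'vUv'; }
}

const channel1 = new NodeUV( false ).generate(); // first UV set
const channel2 = new NodeUV( true ).generate(); // second UV set (e.g. lightmap)
```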
I am making an editor too; currently this would be the interface. Color is the albedo, and transform is the vertex position.
I am also making sure that it can be used in deferred shading. Now I will create the reflection and refraction inputs.
I will be sharing news in the PR; suggestions, tests and enhancements are welcome :+1: