davepagurek opened 1 year ago
I agree that this would be great for general shader use and for #6276
I've started making a proof-of-concept system for making a shader graph out of snippets. There's some explanations in the readme here: https://github.com/davepagurek/shader-builder
The things I'm hoping to address with the prototype:
(…using `struct` types to allow multiple return values from one snippet function)

Some things I still want to think about/tinker with:
Let me know if anyone has thoughts so far!
I want to review/rework how addon libraries work to a certain extent as part of the investigations I'm doing. Might be worth thinking about how this fits in there as well.
Definitely! What aspects of library building are you thinking about currently?
I'm thinking about whether to change how addons are authored for p5.js. Currently the general advice is to attach methods directly to the `p5` object's prototype (which is all good; jQuery, for example, does the same), or we could also provide a utility function that helps register relevant things such as events or built-in hooks (if and when we add them).
The prototype approach has some minor pros and cons: on the pro side, it is easy to write, aligns with how internal modules work, and works with existing addons; a con is that addons need to be loaded before p5 initialization, so features cannot be added dynamically while the runtime is running.
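The prototype-attachment pattern can be sketched without loading p5 itself; `P5Stub` and `doubled` below are hypothetical stand-ins, not real p5 API:

```javascript
// Stand-in constructor to illustrate the pattern without loading p5 itself
function P5Stub() {}

// An addon attaches its feature directly to the prototype
// (`doubled` is a hypothetical example method)
P5Stub.prototype.doubled = function (x) {
  return x * 2;
};

// Every instance shares the prototype, so the method is available on all sketches
const sketch = new P5Stub();
console.log(sketch.doubled(21)); // 42
```

This is essentially what attaching to `p5.prototype` does today, which is why it works with existing addons but requires the addon script to run before the sketch starts.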
I'm also reviewing this as part of the docs review of creating_libraries.md that I'm looking at. Some of it might not be relevant, but I'm still thinking it through.
Here's an update on this with some new thoughts!
For that reason, I think it makes sense to define a hooks API, inspired by the hooks in Luma's Gaussian splat library. From the end user's point of view, you could augment a shader like this:
const myShader = p5.RendererGL.defaultFillShader.fillHooks({
  uniforms: `uniform float time;`,
  vertex: {
    getLocalPosition: `(vec3 pos) {
      pos.y += 20.0 * sin(time);
      return pos;
    }`
  },
})
Internally, when we define the shader to have a hook, it'd look like this: in the shader source, use `HOOK_hookName` as a function, and then provide a default implementation of the function under the `hookName` key in a `vertexHooks` settings object:
defaultFillShader = createShader(
  `attribute vec3 aPosition;
  void main() {
    vec3 localPosition = HOOK_getLocalPosition(aPosition);
    // ...etc
  }`,
  fragSrc, // omitted for brevity
  {
    vertexHooks: {
      // For each hook, provide a default value
      getLocalPosition: `(vec3 pos) { return pos; }`
    }
  }
)
(Optionally, for performance, we can also add a `#define` when a hook is filled, so a shader that wants to ensure there are no extraneous function calls when the hooks aren't filled can do so with preprocessor directives.)
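For instance, a shader's `main` could guard the call like this (a sketch; the macro name `AUGMENTED_HOOK_getLocalPosition` is an assumption, and in practice the guard macro would need to differ from the hook function's own name so the `#define` doesn't expand the call itself):

```glsl
vec3 localPosition = aPosition;
// Only pay for the function call when the user actually filled the hook
#ifdef AUGMENTED_HOOK_getLocalPosition
localPosition = HOOK_getLocalPosition(aPosition);
#endif
```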
The hooks object is stored in the shader, and when a user calls `fillHooks(...)`, the default hook snippets get replaced with the user's snippets. The full `setHooks` signature would be:
type HooksOptions = {
  // A string spliced into both shaders above `main`, e.g. for `uniform`s
  declarations?: string
  // Options for each shader individually
  vertex: SingleHookOptions
  fragment: SingleHookOptions
}

type SingleHookOptions = {
  // A string spliced in before `main`, e.g. for `out` variables in the vertex and `in` variables in the fragment shader
  declarations?: string
  // Implementations of the other hooks defined by the shader
  [hookName: string]: string
}
setHooks(options: HooksOptions): p5.Shader
When we compile the shader, we'd splice in a string with the hook definitions:
filledVertSrc() {
  const main = 'void main';
  const [preMain, postMain] = this._vertSrc.split(main);
  let hooks = '';
  if (this.hooks.declarations) {
    hooks += this.hooks.declarations + '\n';
  }
  for (const hookName in this.hooks.vertex) {
    if (hookName === 'declarations') {
      hooks += this.hooks.vertex.declarations + '\n';
    } else {
      // Add a #define so that if the shader wants to use preprocessor directives to
      // optimize away the extra function calls in main, it can do so
      hooks += '#define HOOK_' + hookName + '\n';
      hooks += 'HOOK_' + hookName + this.hooks.vertex[hookName] + '\n';
    }
  }
  return preMain + hooks + main + postMain;
}
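Extracted into a standalone function, the splicing step can be exercised directly (an illustrative sketch, not p5's actual API; a real implementation would also need to record each hook's return type):

```javascript
// Standalone sketch of the hook-splicing step above
function fillVertSrc(vertSrc, vertexHooks) {
  const main = 'void main';
  const [preMain, postMain] = vertSrc.split(main);
  let hooks = '';
  for (const hookName in vertexHooks) {
    // Splice each hook body in as a function named HOOK_<name>
    hooks += 'HOOK_' + hookName + vertexHooks[hookName] + '\n';
  }
  return preMain + hooks + main + postMain;
}

const vertSrc =
  'attribute vec3 aPosition;\n' +
  'void main() { gl_Position = vec4(HOOK_getLocalPosition(aPosition), 1.0); }';
const filled = fillVertSrc(vertSrc, {
  getLocalPosition: '(vec3 pos) { return pos; }'
});
// The hook definition now appears before main()
console.log(filled.indexOf('HOOK_getLocalPosition(vec3 pos)') < filled.indexOf('void main')); // true
```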
As for the specific hooks to include, I think it'd be something like this. In the vertex shader:
- `void beforeMain()` (e.g. to set some globals that other hook functions might use)
- `vec3 getLocalPosition(vec3 position)` (called before multiplying the view matrix)
- `vec3 getWorldPosition(vec3 position)` (called after multiplying the view matrix)
- `vec3 getLocalNormal(vec3 normal)` (called before multiplying the view matrix)
- `vec3 getWorldNormal(vec3 normal)` (called after multiplying the view matrix)
- `vec2 getUV(vec2 uv)`
- `vec4 getVertexColor(vec4 color)`
- `void afterMain()` (e.g. to set values of any `out` variables)

And in the fragment shader:
- `void beforeMain()`
- `vec4 getWorldNormal(vec3 normal)` (e.g. for bump mapping)
- `vec4 getBaseColor(vec4 color)`
- `vec4 getAmbientMaterial(vec4 color)`
- `vec4 getSpecularMaterial(vec4 color)`
- `float getShininess(float shininess)`
- `vec4 combineColors(ColorComponents components)` (e.g. to multiply some components instead of adding, etc. We'll make a struct rather than passing in tons of positional args.)
- `void afterMain()`
- `float getThickness(float thickness)`
- `vec2 getUV(vec2 uv)`
- `vec4 getVertexColor(vec4 color)`
- `getFinalColor(vec4 color)`
@davepagurek For the hooks idea, I am exploring standardizing it for 2.0 as well. We currently have some hooks like `beforeSetup`, `afterSetup`, `pre`, `post`, etc. that are available for library authors. I plan to have them renamed to something more descriptive.
For shader hooks, I'm not sure how feasible it is, but would it be possible to have the WebGL renderer module extend this functionality so that additional shader hooks can be defined using syntax similar to the default lifecycle hooks? The main goal is to reduce concept duplication where possible, but if it doesn't make sense in this context, we can think about how to prevent confusion instead.
The main thing unique about this shader scenario is that instead of functions, you'd supply GLSL strings. Other than that though, it seems feasible.
For p5 lifecycle hooks, those would look like this, right?
p5.registerAddon((p5, fn, lifecycles) => {
  lifecycles.postdraw = function() {
    // Run actions after `draw()` runs
  };
})
Previously, I was thinking about that as a method on a default shader. I had initially suggested `fillHooks`, but maybe something like `augment` would work better? I had some hooks namespaced as vertex or fragment hooks, but that can always be done by the hook name instead. Flattened into just one object, that could look something like this:
const myShader = defaultShader.augment({
  declarations: `uniform float timeScale;`,
  getLocalPosition: `(vec3 pos) {
    pos.y += 20.0 * sin(time * timeScale);
    return pos;
  }`
})
But if we use a callback function, we could use an assignment instead:
const myShader = defaultShader.augment((lifecycles) => {
  lifecycles.declarations = `uniform float timeScale;`
  lifecycles.getLocalPosition = `(vec3 pos) {
    pos.y += 20.0 * sin(time * timeScale);
    return pos;
  }`
})
Do you think that interface gets close enough to the p5 lifecycle hooks for it to feel familiar?
Just thinking a bit out loud here, in terms of API:
p5.registerAddon((p5, fn, lifecycles) => {
  // `webgl` to namespace things
  lifecycles.webgl.defaultShader.augment = {
    // ....
  };
});
Although thinking about it now, is this meant to be a library-author-facing feature only, or would it also be user facing? If it is user facing, then the `registerAddon` API probably isn't the route to go with; but if it is author facing only, ideally it should go through `registerAddon` one way or another.
The idea behind the `lifecycles` argument in the `registerAddon` callback is that it works by assignment, but it isn't directly assigning anything in the p5 internals. Rather, the callback is called with a prepopulated blank object
lifecycles = {};
and on each call of `registerAddon`, the properties attached to `lifecycles` are pushed to an internal array that keeps track of each lifecycle. That way one library does not interfere with the lifecycle of another library, and multiple libraries can register multiple actions attached to the same hook. In this shader context, I'm not sure if it makes sense to work the same way, or whether later-added augments should overwrite the previous ones, but maybe this can help guide the API design.
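The accumulation behavior described above can be sketched as follows (a hypothetical mock for illustration, not p5's actual implementation):

```javascript
// Hypothetical sketch of lifecycle accumulation in registerAddon
const registeredLifecycles = [];

function registerAddon(setup) {
  // Each addon gets its own blank lifecycles object...
  const lifecycles = {};
  setup(/* p5 */ null, /* fn */ null, lifecycles);
  // ...which is pushed to an internal list, so addons don't overwrite each other
  registeredLifecycles.push(lifecycles);
}

function runHook(name) {
  // Every addon that attached an action to this hook gets called
  for (const lifecycles of registeredLifecycles) {
    if (typeof lifecycles[name] === 'function') lifecycles[name]();
  }
}

// Two addons attach to the same hook without interfering
const calls = [];
registerAddon((p5, fn, lifecycles) => { lifecycles.postdraw = () => calls.push('a'); });
registerAddon((p5, fn, lifecycles) => { lifecycles.postdraw = () => calls.push('b'); });
runHook('postdraw');
console.log(calls); // ['a', 'b']
```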
I think the issue with `registerAddon` is that we'd need it to return a new object, since this is a sort of alternate way of constructing a shader. Since the other lifecycles are p5-global listeners or for adding new methods to p5 rather than returning an object, would that make it a bit too different from how the other ones work, or do you think there's a way to extend `registerAddon` to be able to act like a constructor of sorts too?
> is this meant to be a library author facing feature only or would it also be user facing?
Users would be the ones making their own shaders by filling out hooks, with the idea being that rather than writing full vertex + fragment shader source code, they could just write the part that interests them (e.g. just editing position to warp all the points in a mesh, or just editing the color if they want to make a generative texture). Those shaders could be packaged as addons too, like calling `myMaterial()` rather than `shader(myShader)`, but I think a big part of the appeal is making it easier to teach shaders to everyone by not needing to explain everything about shaders at once in order to start using them.
Increasing Access
There have been a number of requests related to the material system in p5, such as adding fog (https://github.com/processing/p5.js/issues/5878) or blend modes for ambient light (https://github.com/processing/p5.js/issues/6123); I've also been working on a library for shader-based warping (https://github.com/davepagurek/p5.warp), and a GSoC project this year will involve working on image-based lighting as an alternative to point/directional/spot lights.
We intentionally don't add every feature into p5 core in order to keep the codebase maintainable, keep the API simple for beginners, and keep the runtime reasonably fast. It would be great to allow community libraries to fill these needs instead! However, the system is currently very difficult to add to externally; the only viable option right now is to package a p5 shader and distribute that, which means keeping your shader up-to-date with internal changes.
A dedicated way to hook into the material system would help people who are interested in contributing via a library test out their ideas, and would give users a larger variety of tools for different needs as new libraries are added.
Most appropriate sub-area of p5.js?
Feature request details
The main difference between adding a material to p5 and writing a full shader is that for the former, you generally want to keep most of the existing shader. The best way to do that right now is copy-and-pasting, which goes stale over time and requires expertise with p5's internals to do in the first place. The design goal would be to allow people to replace specific parts of our shaders without needing to do that.
To narrow the scope, I think this only needs to apply to our fill material with lighting, not lines or text for now.
Some potential pieces a library might want to replace are:
Shader snippets
We can maybe think of our shaders as a collection of code snippets for both the fragment and vertex shader, which have two parts: a header (to specify inputs) and a body (which runs in `main()`), combined like this:

If we break down our current shaders into snippets like that, then we could provide a minimal API for creating a new shader where one could replace just one part. Maybe something like:
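The code example that originally followed here doesn't survive in this copy; as a hedged sketch (all names hypothetical, not a real p5 API), combining header/body snippets might look like:

```javascript
// Hypothetical snippet-based builder: each snippet has a header (inputs)
// and a body (code that runs inside main())
function buildShaderSrc(snippets) {
  const headers = snippets.map((s) => s.header).join('\n');
  const bodies = snippets.map((s) => '  ' + s.body).join('\n');
  return headers + '\nvoid main() {\n' + bodies + '\n}';
}

// A library could swap out just one snippet, keeping the other defaults
const positionSnippet = {
  header: 'attribute vec3 aPosition;\nuniform mat4 uModelViewMatrix;',
  body: 'vec4 viewPosition = uModelViewMatrix * vec4(aPosition, 1.0);'
};
const src = buildShaderSrc([positionSnippet /* , ...remaining default snippets */]);
console.log(src.includes('void main() {')); // true
```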
One downside is that this treats all snippets just as strings, so there may be naming collisions or type mismatches when making snippets work together. It would at least require relatively minimal code to implement, though.
Shader graph
There's an existing library for combining shader pieces to make one shader: https://www.npmjs.com/package/@gerhobbelt/shadergraph. It does much of what the above snippet idea does, but in a more complete and heavier way, where one can define snippets for small bits of code and build a complicated dependency graph to compile into a shader.
Using this benefits from not being built from scratch, but it also adds a new dependency to p5 and means providing a more complicated API to library builders.
Providing access to default shader source
The barest-bones solution might just involve exposing the source code for our current material shaders via variables that libraries can reference. That way they could use our existing vertex shader but write their own fragment shader.
This doesn't solve the problem where one wants to use most of our lighting calculations (and therefore integrate with point/directional/spot light calls made in p5), but it would still be helpful.