pmndrs / postprocessing

A post processing library for three.js.

Render Pipeline Redesign #419

Open vanruesc opened 1 year ago

vanruesc commented 1 year ago

This is an overview of the planned changes for v7. The actual implementation details are subject to change during development.

Feel free to leave feedback, questions or suggestions.

The Current Design

The API in postprocessing is based on three's postprocessing examples; at its core it provides an EffectComposer that uses a WebGLRenderer to render passes. A Pass performs a set of tasks to either render consumable textures or to draw the final result to screen. The RenderPass renders the scene colors to a texture and serves as a starting point for most render pipelines.

A few years ago, #82 introduced the EffectPass, which can merge Effect instances into a single fullscreen pass for optimal shader performance. Since then, passes and effects have been added and improved, but some effects like SSR and motion blur are still missing because the current API doesn't provide a good way to implement them efficiently. Modern effects need additional scene geometry data such as depth, normals, positions, roughness, velocity, etc., which currently requires rendering the scene multiple times using three's limited material override system.
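
For context, a typical setup with the current API looks like this (a minimal v6 sketch; the chosen effects are arbitrary):

```ts
import { WebGLRenderer, Scene, PerspectiveCamera } from "three";
import { EffectComposer, RenderPass, EffectPass, BloomEffect, VignetteEffect } from "postprocessing";

const renderer = new WebGLRenderer();
const scene = new Scene();
const camera = new PerspectiveCamera();

// RenderPass renders the scene colors to a texture; EffectPass merges
// both effects into a single fullscreen pass (#82).
const composer = new EffectComposer(renderer);
composer.addPass(new RenderPass(scene, camera));
composer.addPass(new EffectPass(camera, new BloomEffect(), new VignetteEffect()));

composer.render();
```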

Problems

Implementation Goals

The buffer management in postprocessing needs to become more sophisticated to support modern requirements.

The general IO concept is similar to other node-based systems like Blender's shader nodes, which allow users to define named inputs and outputs. Three's built-in materials must be modified with onBeforeCompile to use MRT effectively (possibly the biggest challenge; see the sketch below). Since MRT requires WebGL 2, effects that make use of the GBuffer may use GLSL 300.
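
A rough sketch of that onBeforeCompile technique, assuming the material is rendered into a target with a second color attachment; the chunk names come from three's current built-in shaders, and the injected code is illustrative rather than the actual v7 implementation:

```ts
import { MeshStandardMaterial } from "three";

const material = new MeshStandardMaterial();
material.onBeforeCompile = (shader) => {

	// Declare an additional MRT output (WebGL 2 / GLSL 300 es).
	shader.fragmentShader = shader.fragmentShader.replace(
		"#include <common>",
		"layout(location = 1) out vec4 outNormal;\n#include <common>"
	);

	// Write the view-space normal to the second attachment.
	shader.fragmentShader = shader.fragmentShader.replace(
		"#include <dithering_fragment>",
		"#include <dithering_fragment>\n\toutNormal = vec4(normalize(normal) * 0.5 + 0.5, 1.0);"
	);

};
```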

Use Case Examples

Common Setup

```ts
import { ... } from "three";
import { ... } from "postprocessing";

const renderer = ...;
const scene = ...;
const camera = ...;

const pipeline = new RenderPipeline(renderer);
pipeline.addPass(new ClearPass());
pipeline.addPass(new GeometryPass(scene, camera, {
	frameBufferType: HalfFloatType,
	samples: 4
}));

pipeline.addPass(new EffectPass(new BloomEffect(), ...));

requestAnimationFrame(function render(timestamp: number): void {

	requestAnimationFrame(render);
	pipeline.render(timestamp);

});
```

Multiple Scenes

The first `GeometryPass` in a pipeline produces the GBuffer. Other `GeometryPass` instances in the same pipeline render to that same GBuffer. To render to separate GBuffers, multiple pipelines must be created.

```ts
const mainPass = new GeometryPass(scene, camera, {
	frameBufferType: HalfFloatType,
	samples: 4
});

const hudPass = new GeometryPass(hudScene, hudCamera);
const effectPass = new EffectPass(new BloomEffect(), ...);
```

```ts
pipeline.addPass(new ClearPass());
pipeline.addPass(mainPass);
pipeline.addPass(hudPass); // Renders to the same buffer as mainPass by default.
pipeline.addPass(effectPass);
```

```ts
// defaultBuffer is an alias for output.buffers.get(Output.BUFFER_DEFAULT).
hudPass.output.defaultBuffer = effectPass.output.defaultBuffer;

pipeline.addPass(new ClearPass());
pipeline.addPass(mainPass);
pipeline.addPass(effectPass);
pipeline.addPass(hudPass); // Renders to the same buffer as effectPass.
```

```ts
const pipelineA = new RenderPipeline(renderer);
const pipelineB = new RenderPipeline(renderer);

const geoPassA = new GeometryPass(sceneA, cameraA, { samples: 4 });
const geoPassB = new GeometryPass(sceneB, cameraB, { samples: 4 });
const blendEffect = new TextureEffect({ texture: geoPassA.output.defaultBuffer.texture });

pipelineA.addPass(new ClearPass());
pipelineA.addPass(geoPassA);

pipelineB.addPass(new ClearPass());
pipelineB.addPass(geoPassB);
pipelineB.addPass(new EffectPass(blendEffect, ...));
```

IO Management

```ts
class ExamplePass extends Pass {

	// Temporary buffers are outputs with private names.
	// Buffer names will be prefixed internally to avoid collisions.
	private static BUFFER_TMP_0 = "buffer.tmp0";
	private static BUFFER_TMP_1 = "buffer.tmp1";

	constructor() {

		super();

		this.input.buffers.set(ExamplePass.BUFFER_TMP_0, null);
		this.input.buffers.set(ExamplePass.BUFFER_TMP_1, null);
		// input.defaultBuffer will automatically be set to previousPass.output.defaultBuffer.texture

		this.output.defaultBuffer = new WebGLRenderTarget(...);
		this.output.buffers.set(ExamplePass.BUFFER_TMP_0, new WebGLRenderTarget(...));
		this.output.buffers.set(ExamplePass.BUFFER_TMP_1, new WebGLRenderTarget(...));

		...

	}

	protected override onInputChange(): void {

		this.copyMaterial.inputBuffer = this.input.buffers.get(ExamplePass.BUFFER_TMP_1);

	}

	override onResolutionChange(resolution: Resolution): void {

		const { width, height } = resolution;
		this.output.buffers.get(ExamplePass.BUFFER_TMP_0).setSize(width, height);
		this.output.buffers.get(ExamplePass.BUFFER_TMP_1).setSize(width, height);
		this.output.setChanged();

	}

	render(): void {

		const { renderer, output } = this;

		this.fullscreenMaterial = this.customMaterial;
		this.customMaterial.inputBuffer = this.input.defaultBuffer;
		renderer.setRenderTarget(output.buffers.get(ExamplePass.BUFFER_TMP_0));
		this.renderFullscreen();

		this.customMaterial.inputBuffer = this.input.buffers.get(ExamplePass.BUFFER_TMP_0);
		renderer.setRenderTarget(output.buffers.get(ExamplePass.BUFFER_TMP_1));
		this.renderFullscreen();

		this.fullscreenMaterial = this.copyMaterial;
		renderer.setRenderTarget(output.defaultBuffer);
		this.renderFullscreen();

	}

}
```

GBuffer Usage

```ts
class ExampleEffect extends Effect {

	private static BUFFER_TMP = "buffer.tmp";

	constructor() {

		super();

		this.fragmentShader = fragmentShader;
		this.uniforms.set(..., ...);

		this.input.buffers.set(GBuffer.DEPTH, null);
		this.input.buffers.set(GBuffer.NORMAL, null);
		// GeometryPass provides optimization options for things like normal-depth downsampling.
		//this.input.buffers.set(GBuffer.NORMAL_DEPTH, null);
		this.input.buffers.set(ExampleEffect.BUFFER_TMP, null);

		// Note: Using the default output buffer in an Effect would result in an error.
		this.output.buffers.set(ExampleEffect.BUFFER_TMP, new WebGLRenderTarget(1, 1, { depthBuffer: false }));

		this.exampleMaterial = ...;

	}

	protected override onInputChange(): void {

		// Refresh uniforms...
		const buffers = this.input.buffers;
		this.exampleMaterial.depthBuffer = buffers.get(GBuffer.DEPTH);
		this.exampleMaterial.normalBuffer = buffers.get(GBuffer.NORMAL);
		this.uniforms.get("exampleBuffer").value = buffers.get(ExampleEffect.BUFFER_TMP);

	}

	...

}
```

Effect Shader Changes

Shader Function Signatures

Fragment Shader

```glsl
vec4 mainImage(in vec4 inputColor, in vec2 uv, in GData data);
void mainUv(inout vec2 uv);
```

Vertex Shader

```glsl
void mainSupport(in vec2 uv);
```

Geometry Data

Effects have access to the geometry data of the current fragment via the `data` parameter of the `mainImage` function. The `EffectPass` detects whether an effect reads a value from this struct and only fetches the relevant data from the respective textures when it's actually needed. Depth can be sampled at another coordinate via `float readDepth(in vec2 uv)`, and the view Z for a given depth can be calculated with `float getViewZ(in float depth)`. `GData` is defined as follows:

```glsl
struct GData {
    vec3 position;
    vec3 normal;
    float depth;
    float roughness;
    float metalness;
    float luminance;
};
```
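
As an example (a hypothetical depth-based fog effect; the constant and the blending are purely illustrative), an effect shader might read only the depth field:

```ts
const fragmentShader = /* glsl */`

	vec4 mainImage(in vec4 inputColor, in vec2 uv, in GData data) {

		// Only data.depth is read, so the EffectPass would only fetch depth.
		float viewZ = getViewZ(data.depth);
		float fog = clamp(-viewZ * 0.01, 0.0, 1.0); // -viewZ = distance in front of the camera
		return mix(inputColor, vec4(1.0), fog);

	}

`;
```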

Uniforms, Macros and Varyings

All shaders have access to the following uniforms:

```glsl
uniform vec4 resolution; // screen resolution (xy), texel size (zw)
uniform vec3 cameraParams; // near, far, aspect
uniform float time;
```

The fragment shader has access to the following additional uniforms:

```glsl
// Availability of actual buffers depends on the input configuration.
struct GBuffer {
    sampler2D color;
    sampler2D position;
    sampler2D depth;
    sampler2D normal;
    sampler2D normalDepth;
    sampler2D roughnessMetalness;
};

uniform GBuffer gBuffer;
```
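
To illustrate how these fit together (a contrived single-tap edge check, assuming the normal buffer stores unpacked view-space normals; the weighting is made up), a fragment shader could compare the current normal with a neighboring texel:

```ts
const fragmentShader = /* glsl */`

	vec4 mainImage(in vec4 inputColor, in vec2 uv, in GData data) {

		// resolution.zw holds the texel size; sample the normal one texel to the right.
		vec3 neighborNormal = texture(gBuffer.normal, uv + vec2(resolution.z, 0.0)).xyz;
		float edge = 1.0 - max(dot(data.normal, neighborNormal), 0.0);
		return vec4(inputColor.rgb + vec3(edge), inputColor.a);

	}

`;
```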

The following varyings are reserved:

```glsl
varying vec2 vUv;
```

Available vertex attributes:

```glsl
attribute vec3 position;
```

Available macros:

braebo commented 1 year ago

> The composer provides a DepthTexture to passes that need one, but this feature is partially broken and too limited.

Just want to emphasize my excitement for the changes here in particular: missing/glitchy depth textures have been a major pain point for me over the past couple of months, and it will be awesome to have that smoothed out a bit.

Another one is the number of variables and methods in pp marked private that I've had to reach into (many of which are also used in the example demos), which can get quite ugly in a fully-typed codebase. I think more conservative use of private members would be worth considering.

vanruesc commented 1 year ago

> Another one is the number of variables and methods in pp marked private that I've had to reach into (many of which are also used in the example demos)

The demos are outdated in that regard. Most settings have been made available through getters/setters in the past; the new manual contains up-to-date examples, but it's still incomplete and will be published as part of v7.
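
For example, instead of reaching into private fields, recent v6 releases expose settings directly (a small v6 sketch; the property names are from BloomEffect):

```ts
import { BloomEffect } from "postprocessing";

// Settings are exposed through getters/setters rather than private fields.
const bloom = new BloomEffect();
bloom.intensity = 2;
bloom.luminanceMaterial.threshold = 0.4;
```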