vanruesc opened 1 year ago
> The composer provides a DepthTexture to passes that need one, but this feature is partially broken and too limited.

Just want to emphasize my excitement for changes here in particular: missing/glitchy depth textures have been a major pain point for me over the past couple months and it will be awesome to have it smoothed out a bit.
Another one is the amount of variables and methods in pp marked `private` that I've had to reach into (many of which are also used in the example demos), which can get quite ugly in a fully-typed codebase. I think more conservative use of `private` variables would be worth considering.

> Another one is the amount of variables and methods in pp marked private that I've had to reach into (many of which are also used in the example demos)
The demos are outdated in that regard. Most settings have been made available through getters/setters in the past; the new manual contains up-to-date examples, but it's still incomplete and will be published as part of v7.
This is an overview of the planned changes for v7. The actual implementation details are subject to change during development.
Feel free to leave feedback, questions or suggestions.
## The Current Design

The API in `postprocessing` is based on three's postprocessing examples; at its core it provides an `EffectComposer` that uses a `WebGLRenderer` to render passes. A `Pass` performs a set of tasks to either render consumable textures or to draw the final result to screen. The `RenderPass` renders the scene colors to a texture and serves as a starting point for most render pipelines.

A few years ago, #82 introduced the `EffectPass`, which can merge `Effect` instances into a single fullscreen pass for optimal shader performance. Since then, passes and effects have been added and improved, but some effects like SSR and motion blur are still missing. This is because the current API doesn't provide a good way to implement such effects efficiently. Modern effects need additional scene geometry data such as depth, normals, positions, roughness, velocity, etc. This currently requires the scene to be rendered multiple times using three's limited material override system.

## Problems
- `EffectComposer` creates two internal render targets to store intermediate results. These buffers are swapped and passed to the `render` method of the passes.
- The `multisampling` (MSAA) setting affects both of these buffers because the composer doesn't know which of them will actually be used by a `RenderPass`.
- The composer provides a `DepthTexture` to passes that need one, but this feature is partially broken and too limited.

## Implementation Goals
The buffer management in `postprocessing` needs to become more sophisticated to support modern requirements.

- Rename `RenderPass` to `GeometryPass`.
- Replace `EffectComposer` with a more lightweight `RenderPipeline` class.
  - A typical pipeline consists of a `ClearPass`, a `GeometryPass` and one or more `EffectPass` instances.
  - The first `GeometryPass` in a pipeline will be considered the main pass (regarding the main scene & camera).
- Passes define `input` and `output` resources.
  - `input` will include `uniforms` and `textures` (alias `buffers`).
  - `output` will include `uniforms` and `renderTargets` (alias `buffers`).
  - A buffer can be a `Texture` or a `WebGLRenderTarget`.
  - GBuffer components can be requested via `GBuffer` (string enum).
  - `Input` and `Output` both define `BUFFER_DEFAULT` which will be used to auto connect passes.
  - The GBuffer will be rendered by the `GeometryPass` with MRT.
  - Buffers will be managed by a `BufferManager` (shared private static instance in `RenderPipeline`).
- `renderToScreen` and `needsSwap` will be removed.
  - The last pass in a pipeline renders to screen (output buffer `null`) if `pipeline.autoRenderToScreen` is `true` (default).

The general IO concept is similar to other node-based systems like Blender's shader nodes, which allow users to define named inputs and outputs. Three's built-in materials must be modified with `onBeforeCompile` to use MRT effectively (possibly the biggest challenge). Since MRT requires WebGL 2, effects that make use of the GBuffer may use GLSL 300.

## Use Case Examples
### Common Setup
```ts
import { ... } from "three";
import { ... } from "postprocessing";

const renderer = ...;
const scene = ...;
const camera = ...;

const pipeline = new RenderPipeline(renderer);
pipeline.addPass(new ClearPass());
pipeline.addPass(new GeometryPass(scene, camera, {
	frameBufferType: HalfFloatType,
	samples: 4
}));
pipeline.addPass(new EffectPass(new BloomEffect(), ...));

requestAnimationFrame(function render(timestamp: number): void {

	requestAnimationFrame(render);
	pipeline.render(timestamp);

});
```
### Multiple Scenes
The first `GeometryPass` in a pipeline produces the GBuffer. Other `GeometryPass` instances in the same pipeline render to the same GBuffer. To render to separate GBuffers, multiple pipelines must be created.

```ts
const mainPass = new GeometryPass(scene, camera, { frameBufferType: HalfFloatType, samples: 4 });
const hudPass = new GeometryPass(hudScene, hudCamera);
const effectPass = new EffectPass(new BloomEffect(), ...);
```

```ts
pipeline.addPass(new ClearPass());
pipeline.addPass(mainPass);
pipeline.addPass(hudPass); // Renders to the same buffer as mainPass by default.
pipeline.addPass(effectPass);
```

```ts
// defaultBuffer is an alias for output.buffers.get(Output.BUFFER_DEFAULT)
hudPass.output.defaultBuffer = effectPass.output.defaultBuffer;

pipeline.addPass(new ClearPass());
pipeline.addPass(mainPass);
pipeline.addPass(effectPass);
pipeline.addPass(hudPass); // Renders to the same buffer as effectPass.
```

```ts
const pipelineA = new RenderPipeline(renderer);
const pipelineB = new RenderPipeline(renderer);

const geoPassA = new GeometryPass(sceneA, cameraA, { samples: 4 });
const geoPassB = new GeometryPass(sceneB, cameraB, { samples: 4 });
const blendEffect = new TextureEffect({ texture: geoPassA.output.defaultBuffer.texture });

pipelineA.addPass(new ClearPass());
pipelineA.addPass(geoPassA);

pipelineB.addPass(new ClearPass());
pipelineB.addPass(geoPassB);
pipelineB.addPass(new EffectPass(blendEffect, ...));
```
### IO Management
```ts
class ExamplePass extends Pass {

	// Temporary buffers are outputs with private names.
	// Buffer names will be prefixed internally to avoid collisions.
	private static BUFFER_TMP_0 = "buffer.tmp0";
	private static BUFFER_TMP_1 = "buffer.tmp1";

	constructor() {

		super();

		this.input.buffers.set(ExamplePass.BUFFER_TMP_0, null);
		this.input.buffers.set(ExamplePass.BUFFER_TMP_1, null);

		// input.defaultBuffer will automatically be set to previousPass.output.defaultBuffer.texture
		this.output.defaultBuffer = new WebGLRenderTarget(...);
		this.output.buffers.set(ExamplePass.BUFFER_TMP_0, new WebGLRenderTarget(...));
		this.output.buffers.set(ExamplePass.BUFFER_TMP_1, new WebGLRenderTarget(...));

		...

	}

	protected override onInputChange(): void {

		this.copyMaterial.inputBuffer = this.input.buffers.get(ExamplePass.BUFFER_TMP_1);

	}

	override onResolutionChange(resolution: Resolution): void {

		const { width, height } = resolution;
		this.output.buffers.get(ExamplePass.BUFFER_TMP_0).setSize(width, height);
		this.output.buffers.get(ExamplePass.BUFFER_TMP_1).setSize(width, height);
		this.output.setChanged();

	}

	render(): void {

		const { renderer, output } = this;

		this.fullscreenMaterial = this.customMaterial;
		this.customMaterial.inputBuffer = this.input.defaultBuffer;
		renderer.setRenderTarget(output.buffers.get(ExamplePass.BUFFER_TMP_0));
		this.renderFullscreen();

		this.customMaterial.inputBuffer = this.input.buffers.get(ExamplePass.BUFFER_TMP_0);
		renderer.setRenderTarget(output.buffers.get(ExamplePass.BUFFER_TMP_1));
		this.renderFullscreen();

		this.fullscreenMaterial = this.copyMaterial;
		renderer.setRenderTarget(output.defaultBuffer);
		this.renderFullscreen();

	}

}
```
### GBuffer Usage
```ts
class ExampleEffect extends Effect {

	private static BUFFER_TMP = "buffer.tmp";

	constructor() {

		super();

		this.fragmentShader = fragmentShader;
		this.uniforms.set(..., ...);

		this.input.buffers.set(GBuffer.DEPTH, null);
		this.input.buffers.set(GBuffer.NORMAL, null);
		// GeometryPass provides optimization options for things like normal-depth downsampling.
		//this.input.buffers.set(GBuffer.NORMAL_DEPTH, null);
		this.input.buffers.set(ExampleEffect.BUFFER_TMP, null);

		// Note: Using the default output buffer in an Effect would result in an error.
		this.output.buffers.set(ExampleEffect.BUFFER_TMP, new WebGLRenderTarget(1, 1, { depthBuffer: false }));

		this.exampleMaterial = ...;

	}

	protected override onInputChange(): void {

		// Refresh uniforms...
		const buffers = this.input.buffers;
		this.exampleMaterial.depthBuffer = buffers.get(GBuffer.DEPTH);
		this.exampleMaterial.normalBuffer = buffers.get(GBuffer.NORMAL);
		this.uniforms.get("exampleBuffer").value = buffers.get(ExampleEffect.BUFFER_TMP);

	}

	...

}
```
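The GBuffer inputs consumed above have to be produced by patching three's built-in materials with `onBeforeCompile`, as noted under Implementation Goals. The following is a hedged sketch of that kind of shader string patching, reduced to a plain function so it can run standalone; the injection anchors, the `outNormal` name and the `vNormal` varying are hypothetical, not the actual v7 implementation.

```typescript
// Hedged sketch: the kind of patching onBeforeCompile would perform on a
// built-in material's fragment shader to add a second MRT output for normals.
// All identifiers and injection points are hypothetical.
function addNormalOutput(fragmentShader: string): string {

	return fragmentShader
		// Declare a second color attachment (requires WebGL 2 / GLSL 300).
		.replace("void main() {", "layout(location = 1) out vec4 outNormal;\nvoid main() {")
		// Write the normal at the end of the fragment stage.
		.replace("#include <dithering_fragment>", "#include <dithering_fragment>\n\toutNormal = vec4(normalize(vNormal), 1.0);");

}

// In practice this would run inside material.onBeforeCompile((shader) => { ... }).
const patched = addNormalOutput("void main() {\n\t#include <dithering_fragment>\n}");
console.log(patched.includes("outNormal")); // true
```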
## Effect Shader Changes
### Shader Function Signatures
#### Fragment Shader
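As a rough sketch of what the fragment entry points could look like, based on the `data` parameter described under Geometry Data and the existing v6 conventions (hypothetical and subject to change):

```glsl
// Hypothetical sketch; the final v7 signatures may differ.
vec4 mainImage(const in vec4 inputColor, const in vec2 uv, const in GData data);
void mainUv(inout vec2 uv);
```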
#### Vertex Shader
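A sketch of the vertex entry point, mirroring the v6 `mainSupport` hook (hypothetical):

```glsl
// Hypothetical sketch; mirrors the v6 vertex hook.
void mainSupport(const in vec2 uv);
```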
### Geometry Data
Effects have access to the geometry data of the current fragment via the `data` parameter of the `mainImage` function. The `EffectPass` detects whether an effect reads a value from this struct and only fetches the relevant data from the respective textures when it's actually needed. Sampling depth at another coordinate can be done via `float readDepth(in vec2 uv)`. To calculate the view Z based on depth, the function `float getViewZ(in float depth)` can be used. `GData` is defined as follows:
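A plausible sketch of the struct, with fields inferred from the geometry data listed earlier (depth, normals, roughness, etc.); the actual definition may differ:

```glsl
// Hypothetical sketch; actual fields depend on the GBuffer configuration.
struct GData {

	float depth;
	vec3 normal;
	float roughness;

};
```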
### Uniforms, Macros and Varyings
All shaders have access to the following uniforms:

The fragment shader has access to the following additional uniforms:

The following varyings are reserved:

Available vertex attributes:

Available macros:

- When the main camera is a `PerspectiveCamera`, the macro `PERSPECTIVE_CAMERA` will be defined.
- When the frame buffer uses a high precision type such as `HalfFloatType`, `FRAMEBUFFER_PRECISION_HIGH` will be defined.
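The `PERSPECTIVE_CAMERA` macro matters for depth conversion helpers like `getViewZ`. As a plain TypeScript illustration of the underlying math, this sketch mirrors three's `perspectiveDepthToViewZ`/`orthographicDepthToViewZ` packing helpers; it is an assumption that `getViewZ` dispatches on the camera macro like this, not the actual library code.

```typescript
// Sketch of the depth-to-viewZ math that getViewZ would use per camera type.

// When PERSPECTIVE_CAMERA is defined:
function perspectiveDepthToViewZ(depth: number, near: number, far: number): number {

	return (near * far) / ((far - near) * depth - far);

}

// Otherwise (orthographic camera):
function orthographicDepthToViewZ(depth: number, near: number, far: number): number {

	return depth * (near - far) - near;

}

console.log(perspectiveDepthToViewZ(0.0, 0.5, 100.0)); // -0.5 (view Z at the near plane)
```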