hybridherbst opened this issue 3 months ago
This isn't a complete answer but at least a start.
- Is WebGPURenderer mature enough to start switching to it?
That really depends on the use case but in general no. We need a couple more releases to add missing features of WebGLRenderer and fix some known bugs and performance issues.
- Is WebGPURenderer with { forceWebGL: true } currently 100% matching WebGLRenderer?
No.
It will never be a 100% match since WebGL/GLSL-specific classes like ShaderMaterial, RawShaderMaterial or GLBufferAttribute won't be supported. Besides, certain features might be implemented differently, e.g. how post-processing or clipping works or how a mirror (reflector) is defined.
- Are there specific things to keep in mind when using GLTFLoader with WebGPURenderer? Is there anything extra that a developer needs to do? Does it use nodes when doing that, or the previous materials?
WebGPURenderer can process existing material configurations except for ShaderMaterial and RawShaderMaterial. So the renderer understands MeshStandardMaterial, MeshPhysicalMaterial and MeshBasicMaterial (which is used by GLTFLoader), but of course it internally uses the new material system for rendering.
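A minimal sketch of the glTF case; the 'three/webgpu' entry point and the asset path are placeholders, but the loader itself is used unchanged:

```js
// GLTFLoader is used as before; the MeshStandardMaterial instances it creates are
// handled by WebGPURenderer directly, no manual conversion step is needed.
import * as THREE from 'three/webgpu';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

const renderer = new THREE.WebGPURenderer( { antialias: true } );
renderer.setSize( window.innerWidth, window.innerHeight );
document.body.appendChild( renderer.domElement );

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 50, window.innerWidth / window.innerHeight, 0.1, 100 );
camera.position.set( 0, 1, 3 );

const gltf = await new GLTFLoader().loadAsync( 'model.glb' ); // placeholder asset
scene.add( gltf.scene );

renderer.setAnimationLoop( () => renderer.render( scene, camera ) );
```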
- Will WebGPURenderer move to core at some point, or will it stay an example?
In dev it is already in the core. With the next release, r167, there will be separate builds with WebGPURenderer and the node material (three.webgpu.js and three.webgpu.min.js). These new builds do not include WebGLRenderer.
- Is there documentation for switching? For example, it's not clear to me if all classes WebGL* need to be swapped, or if it's fine to use WebGLRenderTarget with WebGPURenderer.
Not yet. IMO, WebGPURenderer isn't yet at a state where we can recommend that the broad community switch over. When we are confident about that, there will be a migration guide.
Thank you so much for taking a stab at it!
I have some follow-up questions:
I would like to start using TSL and nodes. A couple releases earlier (r162), I was able to do so from WebGLRenderer. Now, I can't figure out how, and all examples disappeared. How can I use nodes again while staying on the recommended and stable WebGLRenderer?

Is there a path already for WebXR support in WebGPURenderer? ("path" as in: someone is interested in taking it on for three.js, spec is ready, ...). I couldn't find examples, so I assume it hasn't been started yet. renderer.xr only has { enabled: boolean } at the moment; getSession(), getCamera(), ... have not been added yet.

Thanks again!
- I would like to start using TSL and nodes. A couple releases earlier (r162), I was able to do so from WebGLRenderer. Now, I can't figure out how, and all examples disappeared. How can I use nodes again while staying on the recommended and stable WebGLRenderer?
This isn't possible. TSL and nodes only work with WebGPURenderer since a meaningful integration in WebGLRenderer wasn't possible.
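For context, a minimal sketch of what "nodes with WebGPURenderer" looks like; the 'three/webgpu' and 'three/tsl' entry points reflect current releases and are assumptions here:

```js
// A node material built from a TSL node graph instead of a GLSL string.
import * as THREE from 'three/webgpu';
import { uv, vec4 } from 'three/tsl';

const material = new THREE.MeshBasicNodeMaterial();
material.colorNode = vec4( uv(), 0.0, 1.0 ); // fragment color defined as a node graph

const mesh = new THREE.Mesh( new THREE.PlaneGeometry( 1, 1 ), material );
// render the mesh with WebGPURenderer as usual; WebGLRenderer will not compile node materials
```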
- Is there a path already for WebXR support in WebGPURenderer? ("path" as in: someone is interested in taking it on for three.js, spec is ready, ...). I couldn't find examples, so I assume it hasn't been started yet.
This task has not been started yet.
This isn't possible
This is not really clear to me. It was possible up until r163. What has changed? I understand that some nodes can't be supported, but it was definitely possible to construct node graphs, use MaterialXLoader, ... from WebGLRenderer.
Well, with "isn't possible" I mean that the effort is just too high to integrate and maintain the node material in its current state into WebGLRenderer
. Ideally, we can move away from WebGLRenderer
as fast as possible but that means we have to limit what we want to support in WebGLRenderer
and focus on WebGPURenderer
instead.
I have some follow-up questions regarding documentation that I think can live in this issue as well.
three/webgpu deprecates ShaderChunk, ShaderLib, UniformsLib and other WebGL-related code paths. I would like to understand what the current thinking is around upgrade paths for WebGPURenderer.
Here are a few examples where I would welcome some insights on how this will conceptually work (or how it works today?):
**Custom tonemapping**
Custom tonemapping has so far worked by patching ShaderChunk.tonemapping_pars_fragment: https://github.com/mrdoob/three.js/blob/817a222f2d12baf44a38baf256bc9d4f65d82465/examples/webgl_tonemapping.html#L76
Conceptually, I think the WebGPURenderer equivalent would be to replace some tonemapping node (?) in all shaders that use that tonemapping node, but I don't know how, especially when it's supposed to be done scene-wide (and not on an individual material level).
**Custom lighting and shadow handling**
Similarly, ShaderChunks are patched for PCSS shadows: https://github.com/mrdoob/three.js/blob/e4a6fd8c1ec13d541a679a58ca25428e676e0d4a/examples/webgl_shadowmap_pcss.html#L242
I'm not sure what the node-based upgrade path is here. There seems to be an example for custom lighting models, but that is not quite the same. How to modify the lighting scene-wide (and not on an individual material level) is also not quite clear to me.
**Per-material uniforms**
The combination of material.onBeforeCompile, UniformsLib, and ShaderChunk made it possible to define custom uniforms for objects and modify them per frame. I'm not sure what the node-based upgrade path is here. I found https://github.com/mrdoob/three.js/blob/master/examples/webgpu_instance_uniform.html which seems to cover some of that.
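A hedged sketch of the node-based path for the per-material uniform case, assuming the current 'three/tsl' entry point; the 'highlight' and 'strength' parameters are made up for illustration:

```js
// uniform() creates a node whose value can be changed per frame and per material,
// without onBeforeCompile or shader chunk patching.
import * as THREE from 'three/webgpu';
import { uniform, color, mix } from 'three/tsl';

const highlight = uniform( new THREE.Color( 0xff0000 ) );
const strength = uniform( 0 );

const material = new THREE.MeshStandardNodeMaterial();
material.colorNode = mix( color( 0x4488ff ), highlight, strength );

// called from the animation loop; only the uniform values change, no recompilation
function update( timeMs ) {
	strength.value = 0.5 + 0.5 * Math.sin( timeMs / 1000 );
}
```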
I, too, would like to see a working code snippet showing how to assign a custom ToneMappingNode to the renderer -- one that duplicates Reinhard, for example.
It is not possible to do that. There is no Renderer.toneMappingNode yet.
In the meanwhile, custom tone mapping can be achieved via post processing like so: https://jsfiddle.net/tw1kr7ox/
The fiddle duplicates the Uncharted 2 tone mapping from webgl_tonemapping. The relevant bits are:
postProcessing = new THREE.PostProcessing( renderer );
postProcessing.outputColorTransform = false; // disable default output tone mapping and color space conversion
const scenePass = pass( scene, camera );
postProcessing.outputNode = vec4( CustomToneMappingNode( scenePass.rgb, renderer.toneMappingExposure ), scenePass.a ).toOutputColorSpace();
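For reference, a hedged sketch of what the CustomToneMappingNode used above could look like when written with the TSL Fn() helper; the curve constants follow the Uncharted 2 (Hable) operator from webgl_tonemapping, and the 'three/tsl' import path is an assumption (the fiddle may be written slightly differently):

```js
import { Fn, float } from 'three/tsl';

// Hable / Uncharted 2 filmic curve applied to a single color value
const uncharted2Curve = Fn( ( [ x ] ) => {

	const A = float( 0.15 ), B = float( 0.50 ), C = float( 0.10 );
	const D = float( 0.20 ), E = float( 0.02 ), F = float( 0.30 );

	return x.mul( x.mul( A ).add( C.mul( B ) ) ).add( D.mul( E ) )
		.div( x.mul( x.mul( A ).add( B ) ).add( D.mul( F ) ) )
		.sub( E.div( F ) );

} );

// applies exposure, runs the curve and normalizes against the white point
const CustomToneMappingNode = Fn( ( [ color, exposure ] ) => {

	const whitePoint = float( 1.0 );

	return uncharted2Curve( color.mul( exposure ) )
		.div( uncharted2Curve( whitePoint ) )
		.clamp( 0.0, 1.0 );

} );
```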
@Mugen87 an understanding question: do I get it right that the WebGL fallback renderer no longer has any nodes support? Was that also removed?
@hybridherbst The WebGL fallback renderer does have node support.
Example: https://threejs.org/examples/?q=material%20loader#webgpu_loader_materialx
Thanks! I think I keep being confused by what belongs to what.
Could you confirm that my understanding of the folder structure is correct now:
- WebGPURenderer.js and WebGPURenderer.Nodes.js are the "actual" renderer
- WebGPUBackend.js (in the same folder) and WebGLBackend.js (in the webgl-fallback folder) are the backends for WebGPURenderer
- webgpu/nodes/BasicNodeLibrary.js and /StandardNodeLibrary.js are used for both WebGLBackend and WebGPUBackend
- the other files there, webgpu/nodes/WGSL*, are only used for WebGPUBackend, while webgl-fallback/nodes/* is only used for WebGLBackend
Description
I'm trying to understand what the current state is, where the examples went, and how things are done now with regard to nodes. This is a documentation issue :)
From what I can see:
- All WebGL examples regarding node materials have been removed in https://github.com/mrdoob/three.js/pull/28167.
- New features and bug fixes to the nodes system are only added to WebGPURenderer. No nodes-related features are officially available for WebGLRenderer anymore.
- WebGPURenderer is now in a separate bundle. (It was previously in examples and not part of core three.js.)
- There is no documentation (that I could find) on how to switch from WebGLRenderer to WebGPURenderer. I was wondering if switching with { forceWebGL: true } is a safe step, for example.

Documentation Questions
- Is WebGPURenderer mature enough to start switching to it?
- Is WebGPURenderer with { forceWebGL: true } currently 100% matching WebGLRenderer?
- Are there specific things to keep in mind when using GLTFLoader with WebGPURenderer? Is there anything extra that a developer needs to do? Does it use nodes when doing that, or the previous materials?
- Will WebGPURenderer move to core at some point, or will it stay an example?
- Is there documentation for switching? For example, it's not clear to me if all classes WebGL* need to be swapped, or if it's fine to use WebGLRenderTarget with WebGPURenderer.

Solution
Documentation for how to start using WebGPURenderer.

Alternatives
Waiting until a later point and not starting to use/test WebGPURenderer.

Additional context
No response