
DepthTexture result in errors when assigned to a wgslFn #27502

Closed. Spiri0 closed this issue 5 months ago.

Spiri0 commented 10 months ago

Description

This will probably end up with you anyway, sunag, but I didn't want to address you directly every time. If things continue as they have over the last six months, WebGPU will be largely well integrated this year 😊 I really do need all of these features in a single project, though: getting photorealistic water requires a very wide range of functionality.

Ok, to the issue: I create a WGSL shader with the wgslFn node and assign it to the colorNode. The shader works. In three.js I create a THREE.DepthTexture() and a cubeTexture using THREE.CubeTextureLoader(), and pass both as parameters to the shader. As soon as I try to use these textures with WGSL commands inside the shader, I get the following errors and endless warnings:

DepthTexture error:
WebGPUBindingUtils.js:134 Uncaught (in promise) TypeError: Failed to execute 'writeBuffer' on 'GPUQueue': parameter 1 is not of type 'GPUBuffer'.
    at WebGPUBindingUtils.updateBinding (WebGPUBindingUtils.js:134:16)
    at WebGPUBackend.updateBinding (WebGPUBackend.js:1013:21)
    at Bindings._update (Bindings.js:121:14)
    at Bindings.updateForRender (Bindings.js:73:8)
    at WebGPURenderer._renderObjectDirect (Renderer.js:1020:18)
    at WebGPURenderer.renderObject (Renderer.js:985:9)
    at WebGPURenderer._renderObjects (Renderer.js:935:10)
    at WebGPURenderer.render (Renderer.js:336:40)
    at ThreeJSController.Render (threejs-component.js:94:18)
    at main.js:228:31

CubeTexture warnings:
The shader's binding dimension (TextureViewDimension::e2D) doesn't match the shader's binding dimension (TextureViewDimension::Cube).
 - While validating that the entry-point's declaration for @group(0) @binding(30) matches [BindGroupLayout]
 - While validating the entry-point's compatibility for group 0 with [BindGroupLayout]
 - While validating fragment stage ([ShaderModule "fragment"], entryPoint: main).
 - While validating fragment state.
 - While calling [Device].CreateRenderPipeline([RenderPipelineDescriptor]).

250[Invalid RenderPipeline] is invalid.
 - While encoding [RenderPassEncoder].SetPipeline([Invalid RenderPipeline]).

249[Invalid CommandBuffer from CommandEncoder "renderContext_0"] is invalid.
 - While calling [Queue].Submit([[Invalid CommandBuffer from CommandEncoder "renderContext_0"]])

Reproduction steps

Create a cubeTexture and a DepthTexture

Create a simple WGSL shader and assign the depthTexture and the cubeTexture to it

Errors and endless warnings appear in the console.

Code

//imports needed when pasting this into one of the webgpu examples (r160 import map):
import * as THREE from 'three';
import { wgslFn, texture, MeshBasicNodeMaterial } from 'three/nodes';

function init(){

  const cubeTextureLoader = new THREE.CubeTextureLoader();
  cubeTextureLoader.setPath('./resources/textures/cube/');
  const cubeTexture = cubeTextureLoader.load([
    'px.png', 'nx.png',
    'py.png', 'ny.png',
    'pz.png', 'nz.png'
  ]);
  cubeTexture.minFilter = THREE.LinearFilter;
  cubeTexture.magFilter = THREE.LinearFilter;

  const depthTexture = new THREE.DepthTexture();
  depthTexture.type = THREE.FloatType;

  //renderTarget = new THREE.RenderTarget(window.innerWidth, window.innerHeight);
  //renderTarget.depthTexture = depthTexture;

  //Without depth and envmap in the shader, the code runs and you see the red cube.
  //But the code also has to run with the two lines (depth: texture_depth_2d and envmap: texture_cube<f32>)
  const testWGSL = wgslFn(`
    fn testWGSL(
      depth: texture_depth_2d,
      envmap: texture_cube<f32>
    ) -> vec4<f32> {

      return vec4<f32>(1, 0, 0, 1);
    }
  `);

  const shaderParams = {
    depth: texture(depthTexture),
    envmap: texture(cubeTexture)
  }

  const material = new MeshBasicNodeMaterial();
  material.colorNode = testWGSL(shaderParams);

  const geometry = new THREE.BoxGeometry( 1, 1, 1 );
  const cube = new THREE.Mesh( geometry, material ); 
  scene.add( cube );
}

Live example

https://codepen.io/Spiri0/pen/wvReJKR?editors=1111

Unfortunately, the CodePen example is of limited use; I don't see anything in its console. To reproduce the issue you can simply copy the code from init into one of the webgpu examples and add the corresponding node imports. I kept it as simple as possible.

Screenshots

No response

Version

r160

Device

Desktop

Browser

Chrome

OS

Windows

Spiri0 commented 9 months ago

@sunag: I confess that I have trouble understanding the three.js code. I would have to study it for a long time to understand it better.

Do you have any new insights into what is causing the problems, in case you've been able to take a look in the meantime? I'm aware there are other issues, so I tried to understand it a little better myself and looked into the three.js code, but so far I'm only scratching the surface.

sunag commented 9 months ago

@Spiri0 Have you tried using cubeTexture() instead of texture()?

const shaderParams = {
    depth: texture( depthTexture ),
    envmap: cubeTexture( cubeMap )
}
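For completeness: cubeTexture is an accessor node like texture, so (assuming the r160 import map) it should come from the same module:

import { texture, cubeTexture } from 'three/nodes';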
Spiri0 commented 9 months ago

Ok, the shader works with cubeTexture. I mistakenly thought texture was the node for all types of textures, because I hadn't found a separate accessor for the depth texture in Nodes.js when I looked for it. There is a depthTexture node, but it lives under "display" in Nodes.js. By analogy, it would probably be good to have a depthTexture node among the accessors as well; for that, the node under display would have to be renamed. I found that depthTexture node a bit irritating back in December, since I would have expected it among the accessors. Well, I'm really happy that the cubeTexture works now. Thank you sunag. I'll rename this issue and remove the cubeTexture part.

Spiri0 commented 6 months ago

I think I have now found the cause of the errors and warnings that appear when I pass a depthTexture to a shader. The depthTexture is only actually created together with a renderTarget; before that it doesn't exist on the GPU at all, even if you use

const depthTexture = new THREE.DepthTexture();

to create a depthTexture. The three.js constructor only declares the depthTexture, it does not yet initialize it; that only happens on first use of the renderTarget. The depthTexture only exists on the GPU after the first render into the renderTarget. Passing the depthTexture to a wgslFn before it has been rendered on the GPU therefore leads to the red error message:

WebGPUBindingUtils.js:134 Uncaught (in promise) TypeError: Failed to execute 'writeBuffer' on 'GPUQueue': parameter 1 is not of type 'GPUBuffer'.
    at ...

If the depthTexture is rendered once at the beginning, before the MeshBasicNodeMaterial is created, then everything works without the error message.
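A minimal sketch of that working order (assuming the usual renderer, scene and camera from the surrounding app):

//create and prime the depthTexture before building the material
const depthTexture = new THREE.DepthTexture();
depthTexture.type = THREE.FloatType;

const renderTarget = new THREE.RenderTarget(window.innerWidth, window.innerHeight);
renderTarget.depthTexture = depthTexture;

//one render into the renderTarget allocates the depthTexture on the GPU
renderer.setRenderTarget(renderTarget);
renderer.render(scene, camera);
renderer.setRenderTarget(null);

//only now create the MeshBasicNodeMaterial whose wgslFn reads the depthTexture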

But if I then want to update the depthTexture via the renderTarget every frame, warnings appear in the console on every frame. The reason is that the renderTarget wants to write to the texture while at the same time the MeshBasicNodeMaterial with its wgslFn wants to read it:

[Texture] usage (TextureUsage::(TextureBinding|RenderAttachment)) includes writable usage and another usage in the same synchronization scope.
 - While validating render pass usage.

[Invalid CommandBuffer from CommandEncoder "renderContext_2"] is invalid.
 - While calling [Queue].Submit([[Invalid CommandBuffer from CommandEncoder "renderContext_2"]])
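For reference, the frame loop that provokes these warnings looks roughly like this (a sketch, using the objects from the snippet above):

//every frame:
renderer.setRenderTarget(renderTarget);
renderer.render(scene, camera);    //the render pass writes the depthTexture (RenderAttachment)
renderer.setRenderTarget(null);
renderer.render(scene, camera);    //the material's wgslFn reads the same depthTexture (TextureBinding)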

This reminds me of last year, when I tried to read and write textures with compute shaders for the first time; there were binding conflicts then too. For that problem you, @sunag, introduced the "textureStore" node. I use this node heavily in my ocean repository: the "texture" node is the getter and the "textureStore" node is the setter in the many compute shaders I use. With this clean access separation, other components can safely read the textures without causing conflicts while other systems write via "textureStore". So the good news is that deep down in the wgslFn everything works exactly as it should. I tested that with the webgpu DepthTexture example:

//----------------------------------------------
//you can copy this code right after "quad = new QuadMesh( materialFX );"
//the nodes (texture, attribute, wgslFn) need to be imported at the top, e.g. from 'three/nodes'

const materialParams = {
  depthTexture: texture(depthTexture),
  depthSampler: texture(depthTexture),   //for a sampler parameter, three.js derives the sampler binding from the texture node
  uv: attribute("uv")
}

const fragmentShader = wgslFn(`
  fn main_(
    depthTexture: texture_depth_2d,
    depthSampler: sampler,
    uv: vec2<f32>
  ) -> vec4<f32> {

    var depthValue = textureSample(depthTexture, depthSampler, uv);

    return vec4<f32>(depthValue);
  }
`);

const materialFX2 = new MeshBasicNodeMaterial();
materialFX2.colorNode = fragmentShader(materialParams);

quad = new QuadMesh(materialFX2);
//----------------------------------------------

After you integrated that with the RedIntegerTexture, I also saw the texture_depth_2d in the code and followed it, and it all looked coherent. That's why I concentrated on testing with renderTargets, and that's exactly where the cause lies. The DepthTexture example works, but since the example is very simple, it's more of a happy coincidence that read and write don't get in each other's way there.

This strict separation is necessary for the compute shaders, and I think render targets need the same access separation just as much. The solution would be something like a textureStore in the renderTarget, so that read/write conflicts can no longer occur. I'm glad to have finally narrowed down the cause; this point really bothered me.

Spiri0 commented 6 months ago

I put a compute shader directly downstream of the renderTarget in the update loop. After the renderTarget pass, the compute shader reads the depthTexture with the texture node and writes it via the textureStore node into a second texture, which the wgslFn material shader then reads with the texture node. Since the compute shader is the only writer of this second texture (textureStore) and the material shader only reads it (texture), there are no more error messages. This is of course a complicated method and not a real solution, but it completely validates the previous analysis.

//code snippet from my material class
//assumes the usual node imports, e.g.:
//import { wgslFn, texture, textureStore, instanceIndex, MeshBasicNodeMaterial } from 'three/nodes';

//in the class init:
init(params){
    this.params_ = params;

    this.depthTexture = new THREE.DepthTexture();
    this.depthTexture.type = THREE.FloatType;

    this.renderTarget = new THREE.RenderTarget(window.innerWidth, window.innerHeight);
    this.renderTarget.depthTexture = this.depthTexture;

    this.depthTexture2 = new THREE.StorageTexture(window.innerWidth, window.innerHeight);
    this.depthTexture2.type = THREE.FloatType;

    const shaderParams = {
        depthTexture: texture(this.depthTexture2),
        depthSampler: texture(this.depthTexture2),
        ...
    }

    this.material = new MeshBasicNodeMaterial();
    this.material.colorNode = fragmentShader(shaderParams);

    //test computeShader
    this.SetDepthTexture2 = wgslFn(`
        fn compute(
            writeTex: texture_storage_2d<rgba32float, write>,    //r32float did not seem to work here, so rgba32float is used
            readTex: texture_depth_2d,
            index: u32
        ) -> void {
            var textureSize = textureDimensions(readTex);
            var x = index % textureSize.x;
            var y = index / textureSize.x;    //divide by the width to get the row
            var idx = vec2u(x, y);

            var depth = textureLoad(readTex, idx, 0);
            textureStore(writeTex, idx, vec4<f32>(depth));
        }
    `);

}

//in the class update method:
update(){
    this.params_.renderer.setRenderTarget(this.renderTarget);
    this.params_.renderer.render(this.params_.scene, this.params_.camera);
    this.params_.renderer.setRenderTarget(null);

    //copy the depthTexture into depthTexture2 to avoid conflicts between the renderTarget and this.material
    this.computeDepth = this.SetDepthTexture2({
        index: instanceIndex,
        readTex: texture(this.depthTexture),
        writeTex: textureStore(this.depthTexture2),
    }).compute(window.innerWidth * window.innerHeight);
    this.params_.renderer.compute(this.computeDepth);
}

The problem affects not only the depthTexture but also the interaction of renderTarget textures and material shaders in general. Maybe there is a way to do without the renderTarget and render the scene directly into color and depth textures using compute shaders and the renderer, but I can't assess that. In any case, the investigation part is complete. Maybe a little over-detailed, but forgive me, that's the physicist in me. QED

sunag commented 5 months ago

Solved https://github.com/mrdoob/three.js/pull/28568#issuecomment-2156093937