xml3d / xml3d.js

The WebGL/JS implementation of XML3D

Render depth buffer to texture #188

Open Arsakes opened 8 years ago

Arsakes commented 8 years ago

Let's consider the customRenderTree example provided on the wiki. I wanted to modify it to allow rendering the depth buffer into a texture as well.

When I set depthAsRenderbuffer to false, backBuffer.depthTarget.isTexture is false, and XML3D throws an error when I try to use this buffer as a texture. (I've checked in the XML3D code that with this option set to false such a texture is supposed to be created.)

The code is as follows:

var backBuffer = this.renderInterface.createRenderTarget({
    width: context.canvasTarget.width,
    height: context.canvasTarget.height,
    colorFormat: context.gl.RGBA,
    depthFormat: context.gl.DEPTH_COMPONENT16, // note: the WebGL constant is DEPTH_COMPONENT16, not DEPTH_COMPONENT_16
    depthAsRenderbuffer: false,
    stencilFormat: null
});
console.log(backBuffer.depthTarget.isTexture); // false, but should be true according to the xml3d code
csvurt commented 8 years ago

Hi,

sorry for the delay, I was away on vacation.

Part of the problem is that WebGL doesn't accept depth textures unless the WEBGL_depth_texture extension is activated. Add this line to the initialization of your RenderTree:

var depthTextureExt = context.gl.getExtension("WEBGL_depth_texture");

You should also check that depthTextureExt is not null; if it is, the device you're on doesn't support this extension.
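For example, a minimal guard right after grabbing the extension (the error message is just illustrative):

var depthTextureExt = context.gl.getExtension("WEBGL_depth_texture");
if (!depthTextureExt) {
    throw new Error("WEBGL_depth_texture is not supported on this device");
}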

The other problem is that we internally use gl.FLOAT as the type when creating depth textures. This is a leftover from the days when the floating-point texture extensions in WebGL worked fairly reliably, which is no longer the case. Right now this isn't configurable from outside, so you'll have to change the affected lines in src/renderer/webgl/base/rendertarget.js: search for gl.FLOAT and replace it with gl.UNSIGNED_SHORT (a rough sketch of that change follows the snippet below), then create your depth texture like so:

var backBuffer = this.renderInterface.createRenderTarget({
    width: context.canvasTarget.width,
    height: context.canvasTarget.height,
    colorFormat: context.gl.RGBA,
    depthFormat: context.gl.DEPTH_COMPONENT,
    depthAsRenderbuffer: false,
    stencilFormat: null
});
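For reference, the internal change boils down to swapping the type argument of the texImage2D call that allocates the depth texture. Roughly (the actual code in rendertarget.js may be structured differently, so treat this as a sketch):

// before:
//   gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, width, height, 0,
//                 gl.DEPTH_COMPONENT, gl.FLOAT, null);
// after (16-bit depth values, supported by WEBGL_depth_texture):
gl.texImage2D(gl.TEXTURE_2D, 0, gl.DEPTH_COMPONENT, width, height, 0,
              gl.DEPTH_COMPONENT, gl.UNSIGNED_SHORT, null);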

backBuffer.depthTarget.isTexture will still show false until the target is actually created during the next frame render; after that it should show true.
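If you want to verify that from code, one option (a sketch; the exact hook depends on how your render loop is driven) is to re-check a frame or two later:

// requestAnimationFrame callbacks run before that frame renders, so
// nesting two of them checks after at least one full frame has been drawn.
window.requestAnimationFrame(function () {
    window.requestAnimationFrame(function () {
        console.log(backBuffer.depthTarget.isTexture); // true once the target exists
    });
});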

Note that you'll be limited to 65,536 discrete values in the depth texture (16-bit, since there are no floating-point textures), so accuracy could be a problem. You could try to get floating-point textures working again, but that depends more on the current status of the relevant WebGL extensions than on XML3D; I haven't looked into it in a while.
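For a sense of what 16 bits buys you: depth values are stored non-linearly, so most of the precision sits near the camera. A small self-contained helper (hypothetical name, standard perspective-projection math) that turns a sampled depth value back into an eye-space distance:

// Convert a depth-buffer sample in [0, 1] back to an eye-space distance,
// assuming a standard perspective projection with the given clip planes.
function linearizeDepth(sample, near, far) {
    var zNdc = sample * 2.0 - 1.0; // back to normalized device coordinates [-1, 1]
    return (2.0 * near * far) / (far + near - zNdc * (far - near));
}

console.log(linearizeDepth(0.5, 0.1, 100.0)); // ~0.2: half the encoded range covers only the nearest ~0.2 units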

Hope that helps.