Open arpu opened 4 years ago
@Mugen87 any suggestions how to start on this?
I'm afraid I've never worked with this use case so far, so I'm not sure about this. But providing texture data for testing is probably a good start.
However, I'm not sure what such a provision would look like. Do existing texture compression formats support arrays of textures? Is it the task of a loader to provide an array of compressed textures? Or is this something the app has to compose?
The idea is to use compressed textures for a terrain splat map, discussed with @gkjohnson in https://github.com/gkjohnson/threejs-sandbox/issues/9
using .basis texture compression with a JS array at the app level
What would it look like to compose multiple basis textures into a single `Uint8Array`? This would be required to make the call to `WebGL2RenderingContext.compressedTexImage3D()`.
Here's a demonstration of loading an existing texture as a texture2darray in the webgl2 samples repo: link. You can see that the jpg they load is already composed of the three separate images that correspond to each texture layer.
In the example they write the image to a canvas and read the pixels back as a uint8 array before uploading it, but according to the docs for texImage3D it should work with a canvas or image element as well, so it's not clear why they didn't upload the image directly.
I would think a `Texture2DArray` or `CompressedTexture2DArray` class would expect the provided image to already be composed of the images in the array, like the example above, and otherwise it should be up to the user to generate an image like that. Perhaps there could be a helper class like `Texture2DArrayGenerator` in the examples that can be used to generate a `Texture2DArray` object from a list of textures by drawing them to a canvas. Just a few thoughts!
This is how it works now with uncompressed raw image data:

```js
const data = new Uint8Array(atlas.textures.length * 4 * 1024 * 1024);

for (let t = 0; t < atlas.textures.length; t++) {
  const curTexture = atlas.textures[t];
  const curData = _GetImageData(curTexture.image);
  const offset = t * (4 * 1024 * 1024);
  data.set(curData.data, offset);
}

const diffuse = new THREE.DataTexture2DArray(data, 1024, 1024, atlas.textures.length);
```
In `_GetImageData`, the data is the raw image data read back from a canvas.
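A minimal sketch of the packing step, written so that the canvas readback stays separate from the packing logic (the function name is illustrative, not a three.js API; in the browser each layer's `.data` would come from `ctx.getImageData(...)` as in the snippet above):

```javascript
// Pack the RGBA pixel data of several same-sized layers into one Uint8Array
// suitable for DataTexture2DArray. Each layer is an ImageData-like object
// with a .data byte array of length width * height * 4.
function packRgbaLayers(layers, width, height) {
  const bytesPerLayer = width * height * 4;
  const packed = new Uint8Array(layers.length * bytesPerLayer);
  layers.forEach((layer, i) => {
    // Copy this layer's pixels into its slot in the shared buffer.
    packed.set(layer.data, i * bytesPerLayer);
  });
  return packed;
}
```

The packed buffer can then be passed to `new THREE.DataTexture2DArray(packed, width, height, layers.length)`.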
Right, so you can generate the uint8 buffer on the fly using the read-pixels approach and use a `DataTexture2DArray` for this use case. A `Texture2DArray` or `CompressedTexture2DArray` class would be more convenient and a bit faster, though, considering that reading pixel data from a canvas can be relatively slow.
Let's see what @Mugen87 suggests.
> Do existing texture compression formats support arrays of textures? Is it the task of a loader to provide an array of compressed textures?
The KTX2 (`.ktx2`) format does support arrays of textures with Basis Universal compression types, although it's never used that way within glTF material texture slots. The `.basis` format also does (same compression, different container). I'm not sure about older container formats.
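For reference, the array capability is expressed through the `layerCount` field in the KTX2 file header. Here's a small sketch that reads it from a raw buffer (based on my reading of the KTX2 header layout; verify against the spec before relying on it):

```javascript
// KTX2 files start with a fixed 12-byte identifier, followed by six
// little-endian uint32 fields: vkFormat, typeSize, pixelWidth, pixelHeight,
// pixelDepth, layerCount -- so layerCount sits at byte offset 32.
const KTX2_IDENTIFIER = [0xab, 0x4b, 0x54, 0x58, 0x20, 0x32, 0x30, 0xbb, 0x0d, 0x0a, 0x1a, 0x0a];

function readKtx2LayerCount(arrayBuffer) {
  const bytes = new Uint8Array(arrayBuffer);
  const isKtx2 = KTX2_IDENTIFIER.every((b, i) => bytes[i] === b);
  if (!isKtx2) throw new Error('Not a KTX2 file');
  // A layerCount of 0 means the file is not an array texture.
  return new DataView(arrayBuffer).getUint32(32, true);
}
```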
Hi everyone, any news here?
As far as I understand, right now there's no way to initialize a DataTexture2DArray and add compressed textures to it on the fly, right? Or is there some workaround?
> Or is there some workaround?
There is a `WebGLRenderTarget.setTexture` function (demonstrated in the `webgl2_rendertarget_texture2darray` example) which can be used to set the underlying storage representation of the render target. This means you can render any type of texture, including compressed textures, into each texture array index and sample from them on the GPU. Of course it first requires rendering all the compressed textures to a render target in the appropriate renderer context, but it is usable.
I posted an example of how I used the capability here:
https://twitter.com/garrettkjohnson/status/1458886350824894477
The `setTexture` function isn't documented, though. And it's a bit out of form with the rest of the three.js API, where other render target storage types have their own class definitions, so it's unclear to me whether it should be considered temporary or not.
@gkjohnson thanks for sharing your solution, it worked for me :)
Unfortunately, this method loses compression on the GPU, but I don't see any other solution. At least I have the ability to load CompressedTextures to the GPU quickly.
`THREE.CompressedArrayTexture` will be available with r146 👍.
@Mugen87 How do we render a `THREE.CompressedArrayTexture`? In this example, ShaderMaterial is used. If we don't want to modify the positions etc., can we avoid using it?
I want to know if it's possible to access the decompressed textures by indices.
@CITIZENDOT the built-in three.js materials do not use array textures. You would need to either use ShaderMaterial, or modify a built-in material's shader with custom GLSL, when using an array texture. This example might be a helpful demonstration of how to modify a material's shader (although it does not use array textures specifically).
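A minimal sketch of what the ShaderMaterial route can look like. The GLSL below samples a `sampler2DArray` by layer index; uniform names are illustrative. The material would be built with `new THREE.ShaderMaterial({ vertexShader, fragmentShader, uniforms: { diffuseArray: { value: texture }, layer: { value: 0 } } })`:

```javascript
// GLSL for sampling an array texture in a ShaderMaterial. Under WebGL2,
// three.js compiles shaders as GLSL ES 3.00, where sampler2DArray and the
// texture() overload taking a vec3 (uv + layer index) are available.
const vertexShader = /* glsl */ `
  varying vec2 vUv;
  void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
  }
`;

const fragmentShader = /* glsl */ `
  precision highp sampler2DArray;
  uniform sampler2DArray diffuseArray;
  uniform int layer;
  varying vec2 vUv;
  void main() {
    // The third component of the lookup coordinate selects the array layer.
    gl_FragColor = texture( diffuseArray, vec3( vUv, layer ) );
  }
`;
```

For a splat map, the fragment shader would instead blend several `texture( diffuseArray, vec3( vUv, i ) )` lookups weighted by the splat weights.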
What's missing to close this issue? A Compressed3DTexture class?
Somewhat related:
> What's missing to close this issue? A Compressed3DTexture class?
AFAICT, yes. Compressed 3D textures can't be used yet since `WebGLTextures` does not support the combination of `gl.compressedTexSubImage3D()` and `gl.TEXTURE_3D`. Only `gl.TEXTURE_2D_ARRAY` is supported, which was implemented via #24745.
It would be nice to have a way to use compressed 3D textures and texture arrays in custom shaders.
This request originated from: https://discourse.threejs.org/t/is-it-possible-to-have-a-texture-array-of-compressed-textures/16213