What is wrong?
I've been experimenting with GPU.js for GPGPU, and one thing I need before we can adopt this library is the ability to pipe a texture coming out of a GPU.js kernel into a three.js shader for rendering. As far as I can tell, three.js shaders expect DataTextures (essentially image-backed textures), which seem incompatible with GPU.js's Float32Array-backed textures.
Where does it happen?
In GPU.js, when using setPipeline(true) and trying to use the resulting texture with three.js.
How do we replicate the issue?
1. Produce a Texture from a kernel by using setPipeline(true)
2. Pass the resulting Texture as a uniform to a three.js ShaderMaterial
3. Try to access the values of that Texture in GLSL from the shader to affect rendering
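The steps above, as a minimal sketch (this assumes GPU.js v2 and three.js; the kernel body and output size are placeholders, and it needs a browser/WebGL context to actually run — the point is the type mismatch at the uniform):

```javascript
const { GPU } = require('gpu.js');
const THREE = require('three');

const gpu = new GPU();

// 1. Produce a Texture from a kernel via setPipeline(true)
const kernel = gpu.createKernel(function () {
  return this.thread.x / 256; // placeholder computation
})
  .setOutput([256, 256])
  .setPipeline(true);

const gpuTexture = kernel(); // a GPU.js Texture, not a THREE.Texture

// 2. Pass the resulting Texture as a uniform to a three.js ShaderMaterial.
//    This is where the incompatibility shows up: three.js expects a
//    THREE.Texture (e.g. a DataTexture) as the uniform value.
const material = new THREE.ShaderMaterial({
  uniforms: {
    uGpuTexture: { value: gpuTexture },
  },
  fragmentShader: `
    uniform sampler2D uGpuTexture;
    void main() {
      // 3. Try to read the GPU.js values from GLSL
      gl_FragColor = texture2D(uGpuTexture, vec2(0.5, 0.5));
    }
  `,
});
```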
How important is this (1-5)?
5: This will determine whether or not we can use GPU.js for our use cases
Expected behavior (i.e. solution)
I can use the GPU.js texture in our existing GLSL shaders to affect three.js rendering.
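In the meantime, the only workaround I can think of is readback-based, which defeats the purpose of pipelining since it copies the data back to the CPU. A sketch (scalarToRGBA is a hypothetical helper of mine; I'm assuming GPU.js's documented toArray() and three.js's DataTexture constructor):

```javascript
// Pack a width*height Float32Array of scalars into the RGBA float
// layout that a THREE.DataTexture expects: value in R, alpha = 1.
function scalarToRGBA(data, width, height) {
  const rgba = new Float32Array(width * height * 4);
  for (let i = 0; i < width * height; i++) {
    rgba[i * 4 + 0] = data[i]; // R carries the kernel output
    rgba[i * 4 + 1] = 0;       // G unused
    rgba[i * 4 + 2] = 0;       // B unused
    rgba[i * 4 + 3] = 1;       // A fully opaque
  }
  return rgba;
}
```

Usage would then be something like reading the kernel result back and re-uploading it:

```javascript
// const data = new Float32Array(gpuTexture.toArray().flat());
// const tex = new THREE.DataTexture(
//   scalarToRGBA(data, 256, 256), 256, 256,
//   THREE.RGBAFormat, THREE.FloatType
// );
// tex.needsUpdate = true;
```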
Other Comments
Thanks for all the work you've put into this library, I've really been enjoying playing around with it. I'm not super experienced with GPU computing, so maybe there's an obvious way to achieve this that I'm missing. If so, I would love to see an example of something like that working.