Closed · HyeokSuLee · 3 years ago
We don't have any support for geometry shaders, and it isn't planned either (unless there is a strong need for it). Upstream WebGPU is unlikely to get them at all.
What I suggest looking into instead is https://github.com/gfx-rs/gfx/issues/2878
A related question: is there a plan to be able to generate vertex data in a compute shader and pass that to the vertex shader? The use case I have in mind is a GPU-based 2D rendering library where the user can push a high-level primitive (such as a circle) to the GPU as a "single" vertex carrying all of its data (center, radius, color, ...) as attributes, and a geometry/compute shader then generates the triangles to be passed on to the fragment shader (say 8 or 16 triangles, to avoid drawing too many transparent pixels). A rough sketch of the per-primitive data I mean is below.
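To make that concrete, here is a purely illustrative per-primitive record (the name, fields, and layout are placeholders, not an existing API): each such record would be expanded on the GPU into a small fan of 8-16 triangles before rasterization.

```rust
// Illustrative only: the "single vertex" a user would push per circle.
// A compute (or geometry) stage would expand each record into triangles.
#[repr(C)]
#[derive(Clone, Copy)]
struct CirclePrimitive {
    center: [f32; 2], // position in scene units
    radius: f32,
    _pad: f32,        // padding so the struct keeps 16-byte GPU alignment
    color: [f32; 4],  // RGBA
}
```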
> is there a plan to be able to generate vertex data in a compute shader and pass that to the vertex shader
There doesn't need to be a plan for this. Your compute shader can write to a buffer, which can then be used as a vertex buffer. You can do this today.
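For reference, a minimal host-side sketch of that pattern, assuming a recent wgpu release (descriptor fields and method names have shifted across versions). The compute/render pipelines, bind group layout, render target, and the 32-bytes-per-vertex size are placeholders assumed to be set up elsewhere:

```rust
// Sketch only, requires the `wgpu` crate. One buffer is written by a compute
// pass and then consumed as a vertex buffer by a render pass in the same submit.
fn generate_and_draw(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    compute_pipeline: &wgpu::ComputePipeline, // writes vertices to binding 0
    render_pipeline: &wgpu::RenderPipeline,   // reads them via the vertex stage
    storage_layout: &wgpu::BindGroupLayout,   // layout with one storage buffer
    target: &wgpu::TextureView,               // render target
    vertex_count: u32,
) {
    // Key point: STORAGE (compute output) and VERTEX (render input) usage on
    // the same buffer, so no CPU round trip is needed.
    let vertex_buffer = device.create_buffer(&wgpu::BufferDescriptor {
        label: Some("generated vertices"),
        size: vertex_count as u64 * 32, // assumes 32 bytes per vertex
        usage: wgpu::BufferUsages::STORAGE | wgpu::BufferUsages::VERTEX,
        mapped_at_creation: false,
    });

    let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
        label: Some("vertex generation output"),
        layout: storage_layout,
        entries: &[wgpu::BindGroupEntry {
            binding: 0,
            resource: vertex_buffer.as_entire_binding(),
        }],
    });

    let mut encoder =
        device.create_command_encoder(&wgpu::CommandEncoderDescriptor { label: None });

    {
        // Compute pass: the shader fills `vertex_buffer` with triangle data.
        let mut cpass = encoder.begin_compute_pass(&wgpu::ComputePassDescriptor {
            label: Some("generate vertices"),
            timestamp_writes: None,
        });
        cpass.set_pipeline(compute_pipeline);
        cpass.set_bind_group(0, &bind_group, &[]);
        cpass.dispatch_workgroups((vertex_count + 63) / 64, 1, 1); // @workgroup_size(64)
    }

    {
        // Render pass: the same buffer is bound as vertex input.
        let mut rpass = encoder.begin_render_pass(&wgpu::RenderPassDescriptor {
            label: Some("draw generated vertices"),
            color_attachments: &[Some(wgpu::RenderPassColorAttachment {
                view: target,
                resolve_target: None,
                ops: wgpu::Operations {
                    load: wgpu::LoadOp::Clear(wgpu::Color::BLACK),
                    store: wgpu::StoreOp::Store,
                },
            })],
            depth_stencil_attachment: None,
            timestamp_writes: None,
            occlusion_query_set: None,
        });
        rpass.set_pipeline(render_pipeline);
        rpass.set_vertex_buffer(0, vertex_buffer.slice(..));
        rpass.draw(0..vertex_count, 0..1);
    }

    queue.submit(Some(encoder.finish()));
}
```

Ordering between the two passes is handled by wgpu itself: a storage write followed by a vertex-buffer read of the same buffer within one command encoder gets the necessary barrier inserted for you.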
Closing due to the wgpu-rs -> wgpu transition and because there is already an issue on gfx. Please file a new issue on the wgpu repo if this is a separate problem.
I'm going to implement XR rendering (one view per eye). Is there a way to use a geometry shader? I can't find one. And is a geometry shader a good way to duplicate the geometry for the XR views, or should I just duplicate the vertices for each view on the CPU and send them to the GPU?