Open MendyBerger opened 7 months ago
Swapchains are typically not implemented by the graphics API itself but by the underlying windowing system that the graphics API hooks into. In DirectX, this is the `IDXGISwapChain` COM interface and its `Present` method. In OpenGL, it is an API call into the windowing system to swap buffers, for example `glXSwapBuffers` or `glfwSwapBuffers`.
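To make the OpenGL case concrete, here is a minimal sketch of the classic render loop, using the Rust `glfw` crate's bindings (the `draw_frame` helper is hypothetical; this requires a window, so it is illustrative only):

```rust
// The swap is an explicit call into the windowing layer (GLFW),
// not into OpenGL itself.
while !window.should_close() {
    draw_frame();          // hypothetical: issue GL draw commands for this frame
    window.swap_buffers(); // wraps glfwSwapBuffers: hand the back buffer to the windowing system
    glfw.poll_events();
}
```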
I don't have a strong opinion on this. `graphics-context` may make sense, since it is not as platform-specific as the canvas attachment (and the underlying canvas attachment for the Web could just no-op).
In that case, it should probably be on the canvas, right?
If I understand the spec correctly, the browser mirrors the current texture of a `GPUCanvasContext` to the screen automatically, at least if you draw in `requestAnimationFrame` (I couldn't figure out what exactly "When an HTMLCanvasElement has its rendering updated." means). In native implementations, you need to signal that you're done with the current texture and hand it back to the swapchain to be displayed. `wgpu` uses a special `SurfaceTexture` struct that owns a swapchain texture and has a `present` method to queue it for presentation. The closest analog to that here would probably be the `mini-canvas`.
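For reference, the `wgpu` pattern looks roughly like this (a sketch, omitting device and surface setup, which need a real GPU):

```rust
// Acquire the swapchain texture; wgpu wraps it in a SurfaceTexture
// that owns it until we explicitly hand it back.
let frame = surface.get_current_texture().expect("failed to acquire swapchain texture");
let view = frame.texture.create_view(&wgpu::TextureViewDescriptor::default());

// ... encode render passes targeting `view`, then:
queue.submit(Some(encoder.finish()));

// The explicit signal that the texture is done and should be displayed.
frame.present();
```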
WebGPU doesn't have a present-texture function. On the web that isn't necessary: whenever you yield to the event loop, the browser just presents whatever the current texture contains. This works fine on the web, since it's single-threaded and automatically presents every few milliseconds, but it won't work for WASI. I believe we'll need a `present` method somewhere; the question is on which resource. Here are the options I'm thinking of:

- `graphics-context`
- `graphics-context-buffer`
- `canvas`
- `gpu-texture` / `frame-buffer`

Currently I'm leaning towards `graphics-context` or `graphics-context-buffer`, since I expect all GPU APIs will need this.

(cc @seanisom, please correct all my mistakes here 😂)
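As a strawman for the first option, the WIT could look something like this (names and placement are purely illustrative, not from any spec):

```wit
// Hypothetical sketch: present hangs off graphics-context, so every
// rendering API (WebGPU, frame-buffer, ...) gets it for free.
resource graphics-context {
    // Signal that the current buffer is finished and hand it
    // back to the host to be displayed.
    present: func();
}
```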