The Why
Vulkan is very verbose. To render something that rotates, you need to:
Gather the vertices in a list
Allocate memory for the vertices on the GPU (a host-visible staging buffer)
Copy the vertices to the GPU
Allocate memory for the vertices on the GPU (again, this time device-local)
Copy the vertices from the GPU... to the GPU...
Tell Vulkan how to interpret the vertex data
Use it to create a graphics pipeline with the proper Descriptor Sets and shader files
Allocate memory for the uniform buffer OR
Tell Vulkan that you're going to use some other mechanism (push constants, say) to update the shader
Upload the rotation data
Submit the vertices to the GPU for rendering
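Just to make the pain concrete, here is roughly what the two "allocate memory / copy" steps above look like in raw Vulkan. This is a hedged sketch rather than code from the engine: `device`, `vertices`, `vertexBuffer`, the recording `transferCmd` command buffer and the `findMemoryType()` helper are all assumed to exist already.

```cpp
// Hedged sketch of the "allocate and copy" steps, not code from the engine.
VkBufferCreateInfo stagingInfo{};
stagingInfo.sType       = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO;
stagingInfo.size        = sizeof(vertices[0]) * vertices.size();
stagingInfo.usage       = VK_BUFFER_USAGE_TRANSFER_SRC_BIT;
stagingInfo.sharingMode = VK_SHARING_MODE_EXCLUSIVE;

VkBuffer stagingBuffer;
vkCreateBuffer(device, &stagingInfo, nullptr, &stagingBuffer);

VkMemoryRequirements memReq;
vkGetBufferMemoryRequirements(device, stagingBuffer, &memReq);

VkMemoryAllocateInfo allocInfo{};
allocInfo.sType           = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
allocInfo.allocationSize  = memReq.size;
allocInfo.memoryTypeIndex = findMemoryType(memReq.memoryTypeBits,
    VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT | VK_MEMORY_PROPERTY_HOST_COHERENT_BIT);

VkDeviceMemory stagingMemory;
vkAllocateMemory(device, &allocInfo, nullptr, &stagingMemory);
vkBindBufferMemory(device, stagingBuffer, stagingMemory, 0);

// Copy the vertices to the GPU (well, to host-visible memory)...
void* data;
vkMapMemory(device, stagingMemory, 0, stagingInfo.size, 0, &data);
memcpy(data, vertices.data(), static_cast<size_t>(stagingInfo.size));
vkUnmapMemory(device, stagingMemory);

// ...repeat the whole allocation dance for the device-local buffer (omitted),
// then copy the vertices from the GPU... to the GPU...
VkBufferCopy copyRegion{};
copyRegion.size = stagingInfo.size;
vkCmdCopyBuffer(transferCmd, stagingBuffer, vertexBuffer, 1, &copyRegion);
```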
That's very annoying, and extremely hardcoded for a simple object. What's worse is making a complicated object, with constraints, hinges, animations, textures, specular, cubemaps...
To make this easier would require some serious refactoring.
The Theory
Some sort of builder pattern, whereby you give each bit of geometry the data it needs to construct its vertex buffers and pipeline automatically, would help greatly.
To be able to render a skybox, for example, you only need:
A model
A texture (rather, 6 textures)
A fragment and vertex shader
A matrix to rotate it
To be able to render something like a walrus, you need:
A model
A texture
A fragment and vertex shader
A uniform buffer to light it
A matrix to rotate and move it
This materialized in the form of ModelBuilder, and its associated Model that stores all the data necessary to render an object.
Simply passing it the VkCommandBuffer currently being used to draw lets it handle it all automatically.
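In use, that looks something like the sketch below. ModelBuilder and Model are the engine's names; the individual method names, paths and types are illustrative guesses, not the real API.

```cpp
// Illustrative only: ModelBuilder/Model are real, the methods and paths are
// assumptions about what a builder like this would look like.
Model walrus = ModelBuilder()
    .withMesh("models/walrus.obj")              // the model
    .withTexture("textures/walrus.png")         // the texture
    .withShaders("shaders/lit.vert.spv",
                 "shaders/lit.frag.spv")        // vertex and fragment shaders
    .withUniformBuffer(sizeof(LightingData))    // a uniform buffer to light it
    .build();

// Per frame: set the transform, then hand it the command buffer being recorded.
// The Model binds its own pipeline, descriptor sets and vertex buffers.
walrus.setTransform(rotation * translation);
walrus.draw(commandBuffer);
```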
Likewise, to actually submit something for rendering, we have to:
Allocate a command buffer pool
Create a command buffer
Update attachment backings (swap the swapchain images if necessary, etc)
Create a Render Pass
Wait for a frame to be ready to render
Begin recording the command buffer
Enable the Render Pass
Submit Geometry to be rendered
Move to the next Render Pass (if necessary)
Submit Geometry to be rendered (if necessary)
End recording the command buffer
Tell Vulkan a new frame is ready to render
Submit the command buffer to the GPU
Reset state for a new frame to begin
That's a LOT. If we want to include Compute Shaders in this, it gets a lot more complicated.
To handle this, the GenericRenderPass, ScreenRenderPass, ComputeRenderPass, and RenderCommand abstractions were made.
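A standard render loop now looks like this. (A sketch under assumptions: RenderCommand and ScreenRenderPass are the engine's class names, but the individual calls are illustrative, not the real API.)

```cpp
// Illustrative sketch: the class names are real, the methods are assumptions.
RenderCommand cmd = renderer.startCommand();   // command buffer, sync and swapchain handled internally

cmd.beginPass(screenPass);                     // a ScreenRenderPass targeting the swapchain
cmd.draw(skybox);
cmd.draw(walrus);
cmd.endPass();

cmd.submit();                                  // queue submission, present, reset for the next frame
```

Render passes with multiple subPasses can be handled too (again illustrative, assuming a GenericRenderPass built with two subpasses, e.g. a geometry pass followed by a lighting pass):

```cpp
cmd.beginPass(deferredPass);
cmd.draw(sceneGeometry);     // first subpass
cmd.nextSubpass();
cmd.draw(lightingQuad);      // second subpass
cmd.endPass();

cmd.submit();
```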
Very simply, this takes all the bullshit away from rendering with Vulkan.
The Neat Bit
The new renderer can automatically keep track of things (like textures) that are no longer used.
Say, a Model is destructed. The texture and vertex buffers we uploaded to the GPU are no longer necessary, so we can discard them, but doing that manually is painful.
RefCountedTexture allows a texture to be reused by multiple objects and deleted from the GPU once no object references it any more, with no need for manual tracking or resource destruction.
Likewise, all the Buffer abstractions can automatically handle deleting their linked data when destructed, along with all the Images and VertexBuffers that use them.
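The idea can be sketched with a shared_ptr and a custom deleter. This is only a sketch: the engine's actual RefCountedTexture may do its counting differently, and the struct members here are placeholders.

```cpp
#include <memory>
#include <vulkan/vulkan.h>

// Sketch of the RefCountedTexture idea, not the engine's implementation:
// the GPU-side cleanup runs only when the last owner lets go.
struct TextureData {
    VkImage        image;
    VkImageView    view;
    VkDeviceMemory memory;
};

using RefCountedTexture = std::shared_ptr<TextureData>;

RefCountedTexture makeRefCountedTexture(VkDevice device, TextureData data) {
    return RefCountedTexture(new TextureData(data), [device](TextureData* tex) {
        // Runs when the last Model referencing this texture is destructed.
        vkDestroyImageView(device, tex->view, nullptr);
        vkDestroyImage(device, tex->image, nullptr);
        vkFreeMemory(device, tex->memory, nullptr);
        delete tex;
    });
}
```

Every Model that uses the texture just copies the handle; destructing a Model drops one reference, and the GPU resources go away with the last one.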
Necessary changes
The Module Manager now stores the renderer as a special item, and there's a special case for RendererModule, which exposes the functions the game needs in order to interact with the renderer.
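Roughly (and only roughly: RendererModule is the real name, while the members below are assumptions about what the game needs from the renderer), the special case looks like this:

```cpp
#include <cstdint>
#include <memory>
#include <vector>

class Module { public: virtual ~Module() = default; };

// RendererModule is real; these specific functions are guesses.
class RendererModule : public Module {
public:
    virtual void setEditorMode(bool enabled) = 0;
    virtual void onWindowResize(uint32_t width, uint32_t height) = 0;
};

class ModuleManager {
public:
    void addModule(std::unique_ptr<Module> module) {
        m_modules.push_back(std::move(module));
    }
    void setRenderer(std::unique_ptr<RendererModule> renderer) {
        m_renderer = std::move(renderer);             // stored as a special item
    }
    RendererModule* renderer() { return m_renderer.get(); }

private:
    std::vector<std::unique_ptr<Module>> m_modules;
    std::unique_ptr<RendererModule>      m_renderer;  // not just another Module
};
```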
Extra Features
Just to be nice, I also:
Made it work with macOS
Fixed window resizing
Added an Editor Mode that makes the engine render into an ImGui window that can be moved and manipulated
Put a more complex example into the Game Module to demonstrate the power of the renderer
Added some abstractions for things that aren't currently necessary (such as Compute Shaders)