Closed Earthmark closed 3 years ago
This won't get implemented, sorry. Instancing requires the mesh instance itself to be exactly the same; it won't work across different meshes, even if they have the same geometry. Any of the properties can change at any moment, so this would require a complex system to track which procedural meshes are the same, which has a performance cost of its own. Considering that the boxes are often different sizes and can't be instanced anyway, this wouldn't be worth it.
Baking the geometry, or making a single instance of the mesh and then just referencing it from each MeshRenderer, is the proper solution for this. I could potentially add a function to the asset optimizer to find and deduplicate all procedural meshes with the same settings, the same way materials are deduplicated, but this would also require user action to optimize.
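The deduplication idea described above can be sketched roughly as follows. This is a minimal illustration, not the engine's real API — the dictionary-based mesh representation and the `settings_key` helper are hypothetical; the point is that two procedural meshes are interchangeable exactly when all of their generation parameters match.

```python
# Hypothetical sketch of settings-based deduplication of procedural meshes.
# The mesh representation and field names here are illustrative only.

def settings_key(mesh):
    # Two procedural boxes are interchangeable only if every generation
    # parameter matches; the key captures all of those parameters.
    return (mesh["type"], tuple(mesh["size"]), mesh["scale_uv_with_size"])

def deduplicate(renderers):
    # Point every renderer at one canonical mesh per unique settings key.
    canonical = {}
    for r in renderers:
        key = settings_key(r["mesh"])
        r["mesh"] = canonical.setdefault(key, r["mesh"])
    return len(canonical)

renderers = [
    {"mesh": {"type": "box", "size": [1, 2, 5], "scale_uv_with_size": False}},
    {"mesh": {"type": "box", "size": [1, 2, 5], "scale_uv_with_size": False}},
    {"mesh": {"type": "box", "size": [2, 2, 2], "scale_uv_with_size": False}},
]
unique = deduplicate(renderers)  # two unique meshes remain
```

After running this, the first two renderers share the same mesh object and can be instanced together; the differently sized box keeps its own mesh.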
It may help me to understand the math involved as you see it, because I think I considered that as part of this, but I'd like to better understand where I'm misunderstanding.
My proposal is that instead of each vertex of the box being referenced as follows, with TRS(node) being the TRS transform of a node,
TRS(root) * TRS(slot1) * TRS(slot2) * ... * vertex

the shift would be

TRS(root) * TRS(slot1) * TRS(slot2) * ... * Scale(box size) * vertex_of_unit_cube

So a [1, 2, 5] box would currently be rendered with baked vertices [0.5, 1, 2.5], [-0.5, 1, 2.5], ... but with this it would be rendered as

```
[1 0 0 0]   [ 0.5]      [1 0 0 0]   [-0.5]
[0 2 0 0] * [ 0.5]  ,   [0 2 0 0] * [ 0.5]  ,  ...
[0 0 5 0]   [ 0.5]      [0 0 5 0]   [ 0.5]
[0 0 0 1]   [ 1  ]      [0 0 0 1]   [ 1  ]
```

and so on for each vertex, just like the TRS chain.
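The proposed factoring can be checked numerically: a homogeneous diagonal scale matrix built from the box size, applied to a unit-cube corner, reproduces the vertex that the current per-size mesh would bake directly. This is just a worked example of the math above, not engine code.

```python
# Verify that Scale(box size) * vertex_of_unit_cube equals the baked vertex.
import numpy as np

def scale_matrix(sx, sy, sz):
    # 4x4 homogeneous scale matrix, diag(sx, sy, sz, 1).
    return np.diag([sx, sy, sz, 1.0])

# Unit-cube corner at [0.5, 0.5, 0.5] in homogeneous coordinates.
unit_vertex = np.array([0.5, 0.5, 0.5, 1.0])

S = scale_matrix(1, 2, 5)
scaled = S @ unit_vertex  # -> [0.5, 1.0, 2.5, 1.0]

# The same corner as the current per-size mesh would bake it:
baked = np.array([0.5 * 1, 0.5 * 2, 0.5 * 5, 1.0])
```

Both paths produce the identical position, which is the whole premise of the proposal.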
Obviously baking would use a different path, but this way instancing could occur across different flavors of procedural mesh, because all the procedural part does is apply a scale to a unit cube. I get that with 'scale UV with size' this is different, because that applies to the UVs as well.
Perhaps that's where I'm getting this wrong: my assumption is that the shader input is position, normal, and UV coords. As I see it, normals and UV coords are identical for every procedural box where 'scale UV with size' is disabled; the only difference is position, which can be factored out mathematically as a matrix scale (as shown above).
The math is completely irrelevant here. Like I mentioned, the only thing that matters is whether the instance of the mesh is the same or not. If you reference the same mesh with a MeshRenderer, the rendering pipeline will instance it automatically if possible. However, using non-uniform scaling can break instancing.
The shaders themselves are generic; they don't render a cube specifically. They can't make assumptions about the type of mesh you're using. Instancing works by rendering the same mesh multiple times at different positions in one go, which is done by the rendering pipeline before it reaches the shader itself. It can only instance objects that use the same shader setup (same mesh and same material).
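The batching rule described above can be sketched in a few lines. This is a toy model of what a rendering pipeline does, under the assumption (stated in the reply) that the batching key is the (mesh, material) pair: objects sharing both collapse into one instanced draw call, while a different mesh instance forces a separate call even if its geometry happens to be identical.

```python
# Toy model: instancing batches objects by (mesh, material) identity.
from collections import defaultdict

def count_draw_calls(objects):
    batches = defaultdict(list)
    for obj in objects:
        # Only the identity of the mesh and material matters here,
        # not the geometry they happen to contain.
        batches[(obj["mesh_id"], obj["material_id"])].append(obj["transform"])
    return len(batches)

objects = [
    {"mesh_id": "boxA", "material_id": "triplanar", "transform": "T1"},
    {"mesh_id": "boxA", "material_id": "triplanar", "transform": "T2"},
    # Same geometry as boxA, but a distinct mesh instance -> its own call.
    {"mesh_id": "boxB", "material_id": "triplanar", "transform": "T3"},
]
calls = count_draw_calls(objects)  # -> 2
```

The first two objects become one instanced call with two transforms; the third, despite looking identical, cannot join the batch.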
Ahh, I think I get it now.
Also, I didn't realize non-uniform scale broke instancing... that's good to know.
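One standard reason non-uniform scale is special (an added illustration, not something from this thread): normals must be transformed by the inverse-transpose of the model matrix rather than the matrix itself, so non-uniformly scaled copies are not shader-identical to their siblings. A quick check with a slanted face shows the difference:

```python
# Normals under non-uniform scale: M vs. inverse-transpose of M.
import numpy as np

M = np.diag([1.0, 2.0, 1.0])     # non-uniform scale
n = np.array([1.0, 1.0, 0.0])    # normal of a 45-degree slanted face
t = np.array([1.0, -1.0, 0.0])   # a tangent lying in that face (n . t == 0)

t_scaled = M @ t                  # tangent vectors transform by M directly
naive = M @ n                     # transforming the normal the same way...
correct = np.linalg.inv(M).T @ n  # ...vs. the inverse-transpose rule

np.dot(naive, t_scaled)    # -> -3.0 (no longer perpendicular to the face)
np.dot(correct, t_scaled)  # ->  0.0 (still perpendicular)
```

For uniform scale the two rules agree up to length, which is why only non-uniform scaling forces the pipeline to treat the object differently.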
It seems boxes are a common world-building primitive, especially with triplanar materials. However, in some very primitive testing, it seems procedural meshes are not instanceable via the normal instancing methods (performance dramatically improved when a single, commonly shared procedural box mesh was baked and referenced from a great many mesh renderers).
To better support this primitive, could all procedural box meshes (when 'scale UV with size' is disabled) be made eligible for instancing with each other, by making the procedural box a scale transform of a unit cube? I think this would allow every piece of box geometry using a single triplanar material to be a single instanced draw call, instead of a draw call per cube.
For instance, this would remove a few dozen draw calls from the tutorial world: it uses triplanars so the textures mesh well, but it's all box geometry.