Using multiple UV channels causes my animated meshes to disappear. Most of the time I only populate UV0, but when a mesh needs morph-target animation I use UV1 to address rows in the morph-target texture. As soon as UV1 is in use, the mesh disappears entirely, even with a blank material assigned.
High Precision Texture Coordinates seem broken as well: if I enable them on a regular mesh, the texture coordinates become corrupt and cause massive texture stretching.
The first screenshot uses only one UV channel on a morph-target-animated mesh, so its vertices go everywhere; that's expected. The second screenshot uses two UV channels, and the mesh stops rendering entirely.
I've run it through NVIDIA Nsight: with two UV channels the mesh is never even submitted for rendering, so I think it's failing at an earlier stage.
It seems to fail at RealtimeMeshSimpleData.h line 182: the buffer layout is checked against a handful of known layout types, matches none of them, and the texture coordinates are skipped entirely.
If I comment out the buffer-layout checks for texture coordinates and just assume they are always 2DHalf, the mesh appears with multiple UVs. However, those UVs never actually make it to the GPU from the vertex factory, so my material reads junk data.
I believe this is because the vertex factory never receives the correct number of texture coordinates when multiple UVs are in use. I've solved it in the past in FRealtimeMeshLocalVertexFactory::Initialize(const TMap<FRealtimeMeshStreamKey, TSharedPtr>& Buffers) by adding code (source: https://discord.com/channels/455826886938066986/457247782672269333/1169568918717878322)
at line 189 of RealtimeMeshVertexFactory.cpp, and it seems to fix it, but I wouldn't be happy with that code living there; it feels messy.