marwie opened this issue 8 months ago (Open)
I can't reproduce this in gltfpack. I get the following mesh with simplification factor 0.006 (going lower than this triggers a safety check in gltfpack code that should probably be reworked somehow, so the mesh stays in its original form):
In theory this should have gone all the way down to one quad, but I think there are some tricky edge conditions that might be preventing this from happening.
On the sphere file, with the error threshold of 1e-2, gltfpack simplifies down to 82 triangles:
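(For reference, the invocations were roughly of this form, assuming the usual gltfpack flags where `-si` sets the target triangle ratio and `-se` the simplification error limit; the file names are placeholders:)

```
gltfpack -i plane.glb -o plane.out.glb -si 0.006
gltfpack -i sphere.glb -o sphere.out.glb -si 0.006 -se 0.01
```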
To clarify: the issue outlined here is that in some cases decimation doesn't lead to a roughly uniform simplification (as would be expected), but instead to a state where some parts of the mesh are aggressively reduced while other parts stay at the original density (e.g. this is "ratio: 0.5" from 0:02 in the video):
It almost looks like decimation proceeds vertex by vertex and "stops" once the target ratio is reached; the expected behavior would be a more uniform approach that leads to somewhat similarly sized triangles.
similarly sized triangles
This is not a criterion that the simplifier takes into account. The appearance of a tessellated plane is the same regardless of which interior edges you collapse; the error of every version in this video is close to 0.
Interesting! I wasn't aware that meshoptimizer doesn't optimize for similarly sized triangles, which, to the best of my understanding, is another property that is important for efficient GPU utilization (avoiding small triangles and aiming for a uniform triangle distribution on screen).
We noticed this issue on other meshes as well (e.g. spheres in some cases exhibit "nests" of lots of non-optimized triangles) and also in production meshes, where in some areas nothing was optimized and other areas got optimized aggressively. A plane was just the easiest way to demonstrate it. Sounds like we'll need to provide a better mesh that shows the issue if you say that this is essentially by design.
All other things being equal, more uniform triangle density is definitely preferred for efficiency; the simplifier right now is predominantly concerned with appearance. I was considering some sort of edge length metric addition, which becomes more important for better attribute treatment, but right now the metric is purely "distance from the original surface", which is invariant to density. Introducing additional non-critical factors like this is difficult because they distort the original quality metric, so careful tuning is critical.
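To make the tuning concern concrete, here's a purely hypothetical sketch (not meshoptimizer's implementation) of what blending an edge-length term into a collapse cost could look like; `meshScale` and `lengthWeight` are made-up inputs, and choosing them without distorting the distance-based metric is exactly the hard part:

```cpp
// Hypothetical sketch only - not how meshoptimizer computes collapse costs.
// The geometric error (distance from the original surface) stays the primary
// term; a small, scale-normalized edge-length penalty would merely bias the
// order of collapses toward more uniform triangle density.
float collapseCost(float geometricError, float edgeLength, float meshScale, float lengthWeight)
{
    float lengthTerm = (edgeLength / meshScale) * lengthWeight; // resolution-independent bias
    return geometricError + lengthTerm * lengthTerm;            // squared so short edges contribute very little
}
```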
My experience has been that on production meshes you mostly don't run into this problem - you can still have areas where nothing is optimized of course, but when that happens it's usually because of topological constraints, not because of error constraints. I'd welcome examples where that's a problem, with exact values of error & target count so that I can reproduce this more easily.
Based on the papers, this could be helped by adding a pointwise quadric that measures the distance to the original point: A = I, b = -p, c = p·p, all multiplied by the weight. I don't think this would solve the issue of getting rid of the last few edges down to one quad or triangle, but it would probably help in the intermediate stages.
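For concreteness, a minimal sketch of that pointwise quadric in the usual (A, b, c) form (purely illustrative, not meshoptimizer's internal representation):

```cpp
// Pointwise quadric Q(v) = v^T A v + 2 b^T v + c with A = w*I, b = -w*p,
// c = w*(p.p); evaluating Q at a candidate position v yields w * ||v - p||^2,
// i.e. the weighted squared distance to the original point p.
struct PointQuadric
{
    float a;    // A = a * I (identity scaled by the weight)
    float b[3]; // b = -w * p
    float c;    // c = w * (p . p)
};

PointQuadric makePointQuadric(const float p[3], float w)
{
    PointQuadric q;
    q.a = w;
    for (int i = 0; i < 3; ++i)
        q.b[i] = -w * p[i];
    q.c = w * (p[0] * p[0] + p[1] * p[1] + p[2] * p[2]);
    return q;
}

float evalPointQuadric(const PointQuadric& q, const float v[3])
{
    float quad = q.a * (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);      // v^T A v
    float lin = 2.f * (q.b[0] * v[0] + q.b[1] * v[1] + q.b[2] * v[2]); // 2 b^T v
    return quad + lin + q.c;                                           // = w * ||v - p||^2
}
```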
Hello, below is a video of a 100-vertex plane simplified with ratios [0.5, 0.25, 0.125, ...] and an error of 1.
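For reference, a rough sketch of how a ratio and error like these typically map onto the library call, assuming a recent meshoptimizer version (the `indices`/`positions` buffers stand in for the plane data in assets.zip below):

```cpp
#include <vector>
#include "meshoptimizer.h"

// Derive the target index count from the ratio; target_error = 1 is relative
// to the mesh extents, so it effectively removes the error limit and lets the
// target count drive the simplification.
std::vector<unsigned int> simplifyToRatio(const std::vector<unsigned int>& indices,
                                          const std::vector<float>& positions, // xyz per vertex
                                          float ratio)                         // 0.5, 0.25, 0.125, ...
{
    size_t targetIndexCount = size_t(indices.size() * ratio / 3) * 3;
    float resultError = 0.f;

    std::vector<unsigned int> result(indices.size());
    size_t count = meshopt_simplify(result.data(), indices.data(), indices.size(),
                                    positions.data(), positions.size() / 3, sizeof(float) * 3,
                                    targetIndexCount, /* target_error= */ 1.f,
                                    /* options= */ 0, &resultError);
    result.resize(count);
    return result;
}
```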
It seems like the simplification suddenly stops once a threshold is reached, after which areas of the mesh stay untouched. We also observed this on a relatively high-density sphere, where it resulted in dense islands.
Expected behavior: simplification is uniformly applied.
assets.zip
https://github.com/zeux/meshoptimizer/assets/5083203/24eb2275-da32-4312-bfce-1471a197e7d0