foxyseta closed this issue 2 years ago.
This is a good question!
To have a fair comparison of the two measures, you'd first have to make sure that the relax parameter is the only free parameter. The relax parameter affects the number of free vertices in the mesh and their connectivity, which has a large influence on the smoothing process and thus on the Q measure. Make sure to allow a large number of smoothing iterations and set the tolerance to a smaller value, so that the quality measure can actually get near its maximum.
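To illustrate the interplay between the iteration limit and the tolerance, here is a minimal, library-independent sketch of Laplacian smoothing on a toy mesh. Note this is not Triangle.NET's actual smoother (its `SimpleSmoother` is based on centroidal Voronoi tessellation); the `smooth` function and the toy mesh below are purely illustrative.

```python
# Minimal Laplacian smoothing sketch (illustrative only, not Triangle.NET's
# actual algorithm): each free vertex moves to the centroid of its neighbors
# until the largest displacement drops below a tolerance or the cap is hit.
import math

def smooth(coords, neighbors, fixed, max_iters=1000, tol=1e-8):
    """coords: vertex id -> (x, y); neighbors: vertex id -> adjacent ids;
    fixed: boundary vertex ids that must not move.
    Returns the number of iterations actually performed."""
    for it in range(1, max_iters + 1):
        max_move = 0.0
        for v, nbrs in neighbors.items():
            if v in fixed:
                continue
            cx = sum(coords[n][0] for n in nbrs) / len(nbrs)
            cy = sum(coords[n][1] for n in nbrs) / len(nbrs)
            max_move = max(max_move, math.hypot(cx - coords[v][0],
                                                cy - coords[v][1]))
            coords[v] = (cx, cy)
        if max_move < tol:  # converged: further iterations change nothing
            return it
    return max_iters

# One interior vertex inside a unit square boundary; it converges to the
# centroid of its four fixed neighbors, (0.5, 0.5).
coords = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (1.0, 1.0), 3: (0.0, 1.0),
          4: (0.2, 0.7)}
neighbors = {4: [0, 1, 2, 3]}
iters = smooth(coords, neighbors, fixed={0, 1, 2, 3})
```

With a tolerance that is too large or an iteration cap that is too small, the loop stops before the configuration settles, so the measured mesh quality understates what the smoother could achieve.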
What I'm comparing in example 4 is the optimal triangle size (equilateral, assuming a uniform boundary segment size) and the average area, computed from only the total area of the geometry and the triangle count. This is obviously not a good measure. A more reasonable choice would be the median of all triangle areas, which would probably be closer to your definition of area error.
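A small sketch of why the median is the more robust statistic here. The area values are hypothetical, not taken from the example: a few sliver triangles near the boundary pull the mean away from the typical element size, while the median stays put.

```python
import statistics

# Hypothetical triangle areas: most elements near the target size of ~10,
# plus two small slivers near the boundary.
areas = [10.0, 10.2, 9.8, 10.1, 9.9, 1.5, 2.0]

mean_area = statistics.mean(areas)      # dragged down by the slivers
median_area = statistics.median(areas)  # robust to the outliers
```

Here the mean lands well below the typical element area, while the median reports the size most elements actually have.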
I plotted some of the quality measures for a triangle and a circle geometry, with low and high numbers of smoother iterations. We can see that the weird behavior of the area measures is probably connected to the points at which the number of Steiner points changes. Also, the number of Steiner points does not decrease monotonically with an increasing relax parameter. The different behavior of the area regularity may be explained by the fact that the triangle shape has smaller input angles, which puts a hard limit on the largest attainable minimum angle.
By the way, for the above tests I set `SegmentSplitting = 1` to make sure the boundary segments don't change. Vertices added to the boundary during refinement would introduce irregularities in the smoothing process. This should also be changed in the current code of example 4.
Thank you so much! That was really insightful. On my end, my doubts have been dispelled. I'm leaving this issue open since it is related to that `SegmentSplitting = 1` fix you mentioned, but feel free to close this whenever you think the time is right.
Closing this one. Though preventing segment splitting might be desirable, it defeats the conforming Delaunay option, which is needed for the Voronoi diagram.
In #21, I was informed that `TriangleNet.Examples.Example4.relax` is indeed:

The following geometry is a $50 \times 50$, right-angled triangle that is also half a square. The edge size for its single element was set to $5$.
Different relaxation factors yielded different average element qualities and different average area errors relative to the computed maximum element area.
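A back-of-the-envelope sketch of the reference values involved, under one loud assumption: that the "computed maximum element area" $S_0$ is the area of an equilateral triangle with the prescribed edge size $5$, which may not match how the example actually computes it.

```python
import math

# ASSUMPTION: S0 is the area of an equilateral triangle with edge h = 5.
h = 5.0
s0 = math.sqrt(3) / 4 * h * h    # optimal (equilateral) element area, ~10.83

# The geometry: half of a 50 x 50 square, i.e. a right-angled triangle.
total = 0.5 * 50 * 50            # total area = 1250

approx_count = total / s0        # rough expected number of elements
```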
For each element, its quality was computed as:
$$Q = \frac{4 \sqrt{3} S }{ l_{max} P}$$
( $S$ being the element surface, $l_{max}$ the element's maximum edge length, and $P$ the perimeter.) This is definitely more naive than `QualityMeasure`, but its results still show that both qualities are satisfactory.

For each element, the area error relative to the computed maximum element area is:
$$err = \frac{S - S_0}{S_0}$$
( $S$ being the element surface, $S_0$ the computed maximum element area, without the relaxation factor applied.) As noted in the two screenshots, shifting the relaxation factor from $1.45$ to $1.80$ generated elements whose size was much closer to the ideal equilateral triangle's. So is the actual point of having a relaxation factor something other than guaranteeing an appropriate average element size?
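As a sanity check on the two formulas above, here is a small sketch (the function names and the sample values are mine, not from the example code). The normalization $4\sqrt{3}$ makes $Q = 1$ exactly for an equilateral triangle:

```python
import math

def quality(ax, ay, bx, by, cx, cy):
    """Q = 4*sqrt(3)*S / (l_max * P); equals 1 for an equilateral triangle."""
    a = math.hypot(bx - ax, by - ay)
    b = math.hypot(cx - bx, cy - by)
    c = math.hypot(ax - cx, ay - cy)
    # Surface area via the cross product of two edge vectors.
    s = 0.5 * abs((bx - ax) * (cy - ay) - (cx - ax) * (by - ay))
    return 4 * math.sqrt(3) * s / (max(a, b, c) * (a + b + c))

def area_error(s, s0):
    """err = (S - S0) / S0, relative to the maximum element area S0."""
    return (s - s0) / s0

# An equilateral triangle with unit edge attains the optimal quality Q = 1.
q = quality(0.0, 0.0, 1.0, 0.0, 0.5, math.sqrt(3) / 2)

# An element 10% larger than the reference area has err = 0.1.
err = area_error(11.0, 10.0)
```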