SGI Fellows this year reported errors and MATLAB crashes when using the png2poly function with this example image, so I decided to investigate.
The main problem was that the function mask2poly may return polygons with duplicated vertices. When Laplacian smoothing is applied to these polygons, it may cause self-intersections, as in the example below, where a square with 5 vertices (1 duplicated) turns into a non-simple polygon after 3 iterations:
When the triangulate function in point_inside_polygon tries to triangulate a polygon with duplicated vertices, or a non-simple polygon, it crashes.
Since the Laplacian smoothing parameter used is 1.0, new duplicated points may be generated during the smoothing iterations, and some polygons may collapse instantly. I propose using 0.5 as the new parameter, as it avoids generating duplicated points and self-intersections in most cases. Also, when Laplacian smoothing is applied to a closed curve, it converges to a single point, and after many iterations some polygons become so small that triangulate also crashes.
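To see why the parameter matters, here is a minimal Python sketch of the closed-curve Laplacian smoothing step (an illustration of the idea, not the repo's MATLAB code). With a parameter of 1.0, each vertex is replaced by the midpoint of its two neighbors, so a square collapses to a single point, i.e. four duplicated vertices, in one iteration; with 0.5 it only shrinks toward its center:

```python
import numpy as np

def laplacian_smooth(P, lam=0.5, iters=1):
    """Laplacian smoothing of a closed polygon P (n x 2 array).
    Each iteration moves every vertex toward the average of its two
    cyclic neighbors by a factor lam."""
    P = np.asarray(P, dtype=float)
    for _ in range(iters):
        avg = 0.5 * (np.roll(P, 1, axis=0) + np.roll(P, -1, axis=0))
        P = P + lam * (avg - P)
    return P

square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
collapsed = laplacian_smooth(square, lam=1.0)   # all vertices land on the center
shrunk = laplacian_smooth(square, lam=0.5)      # vertices stay distinct
```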
The last bug found was in the point_inside_polygon function: the triangulation may return extra Steiner points, and they were not being used, causing an indexing error.
This pull request removes duplicated points, changes the Laplacian smoothing parameter to 0.5, removes polygons that have collapsed (area <= 10⁻¹⁵), and uses the Steiner points generated by the triangulation in the barycenter function. The function now looks much more stable in the tests I did.
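The first two cleanup steps can be sketched as follows in Python (a hypothetical helper under the same assumptions, not the actual MATLAB code in this PR): drop consecutive duplicate vertices, then discard the polygon entirely if its shoelace area is at or below the collapse threshold:

```python
import numpy as np

def clean_polygon(P, area_tol=1e-15):
    """Remove consecutive duplicate vertices from closed polygon P (n x 2);
    return None if the polygon has collapsed (shoelace area <= area_tol)."""
    P = np.asarray(P, dtype=float)
    # Keep a vertex only if it differs from its (cyclic) predecessor.
    keep = np.any(P != np.roll(P, 1, axis=0), axis=1)
    P = P[keep]
    if len(P) < 3:
        return None
    x, y = P[:, 0], P[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    return P if area > area_tol else None
```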
Problems identified and not fixed:
The new approach using triangulation in the point_inside_polygon function requires well-behaved input and crashes or throws errors otherwise. It's hard to guarantee that, so going back to the previous ray-intersection approach, or some other algorithmic approach, could be a good idea.
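For reference, the ray-intersection approach tolerates exactly the degenerate inputs that break the triangulation. A Python sketch of the classic even-odd crossing test (illustrative, not the repo's function): zero-length edges produced by duplicated vertices are simply skipped, so no preprocessing is needed:

```python
def point_inside_polygon(pt, P):
    """Even-odd ray-crossing test: count polygon edges crossed by a
    horizontal ray going right from pt. Degenerate edges (both endpoints
    on the same side of the ray, including zero-length edges from
    duplicated vertices) never enter the crossing computation."""
    x, y = pt
    inside = False
    n = len(P)
    for i in range(n):
        x1, y1 = P[i]
        x2, y2 = P[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the ray's height
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xc:
                inside = not inside
    return inside
```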
Some polygons were classified as holes by mask2poly even though they were not.