Mattey40 / poly2tri

Automatically exported from code.google.com/p/poly2tri

Stack overflow on 204 point simple polygon input (C++) #34

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Triangulate a particular point set

What is the expected output? What do you see instead?
Stack overflow leading to segfault. 

What version of the product are you using? On what operating system?
Most recent code base, Linux 2.6.32-33-generic x86_64. I will test with MinGW 
compiled version soon. 

Please provide any additional information below.

The included file contains a double array where rand_test[2*i] is poly[i].x and 
rand_test[2*i+1] is poly[i].y.
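
For reference, a minimal sketch of how such a flattened array can be handed to the C++ API (the array name comes from the attachment description; p2t::Point, p2t::CDT, Triangulate and GetTriangles are the library's public interface):

#include <cstddef>
#include <cstdio>
#include <vector>
#include <poly2tri/poly2tri.h>

// Hypothetical driver; rand_test and the point count come from the attached file.
void TriangulateFlatArray(const double* rand_test, int num_points)
{
  std::vector<p2t::Point*> polyline;
  polyline.reserve(num_points);
  for (int i = 0; i < num_points; ++i) {
    // rand_test[2*i] is poly[i].x and rand_test[2*i+1] is poly[i].y, as described above.
    polyline.push_back(new p2t::Point(rand_test[2 * i], rand_test[2 * i + 1]));
  }

  {
    p2t::CDT cdt(polyline);  // constrained Delaunay triangulation context
    cdt.Triangulate();       // the call that overflows the stack on this input
    std::printf("%u triangles\n", (unsigned)cdt.GetTriangles().size());
  }                          // the CDT owns the resulting Triangle objects

  // poly2tri does not free the input Points (see the memory discussion below),
  // so the caller cleans them up.
  for (std::size_t i = 0; i < polyline.size(); ++i) {
    delete polyline[i];
  }
}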

I encountered this stack overflow when running a stress test on my convex 
decomposition code. I generated a complex polygon by stringing together 
entirely random points in the range -2000 to 2000 in both x and y, and used a 
custom polygon simplification algorithm to extract the perimeter of the 
polygon. 

If you examine this test case you will notice that it is a completely valid 
polygon without extreme angles anywhere; it just has a few small triangular 
features. In any case there should not be a stack overflow here. Here is the 
relevant part of the stack trace: 

#93544 0x0000000000427d44 in CDT_testRoutine (data=0x6b3f80, 
    length=408) at Polygon.cpp:318
318     cdt.Triangulate();
(gdb) 
#93543 0x000000000047ff97 in p2t::Sweep::Triangulate(p2t::SweepContext&) ()
(gdb) 
#93542 0x000000000047fddc in p2t::Sweep::SweepPoints(p2t::SweepContext&) ()
(gdb) 
#93541 0x000000000047f5a9 in p2t::Sweep::EdgeEvent(p2t::SweepContext&, 
p2t::Point&, p2t::Point&, p2t::Triangle*, p2t::Point&) ()
(gdb) 
#93540 0x000000000047f18e in p2t::Sweep::FlipEdgeEvent(p2t::SweepContext&, 
p2t::Point&, p2t::Point&, p2t::Triangle*, p2t::Point&)
    ()
(gdb) 
#93539 0x000000000047f18e in p2t::Sweep::FlipEdgeEvent(p2t::SweepContext&, 
p2t::Point&, p2t::Point&, p2t::Triangle*, p2t::Point&)
    ()
(gdb) 
#93538 0x000000000047f18e in p2t::Sweep::FlipEdgeEvent(p2t::SweepContext&, 
p2t::Point&, p2t::Point&, p2t::Triangle*, p2t::Point&)
    ()
(gdb) 
#93537 0x000000000047f18e in p2t::Sweep::FlipEdgeEvent(p2t::SweepContext&, 
p2t::Point&, p2t::Point&, p2t::Triangle*, p2t::Point&)
    ()
(gdb) 

Good luck!

Original issue reported on code.google.com by stevenlu...@gmail.com on 19 Dec 2011 at 10:49

Attachments:

GoogleCodeExporter commented 9 years ago
Forgot to mention: the two lines I've got commented out are two points that 
create a rather thin spike in the geometry. I had hoped that this "extreme" 
shape was the cause of the issue, but it turns out I still get a stack overflow 
without it. 

If somebody could tell me what sort of features or properties about this input 
are bringing about this stack overflow, and what sorts of input I should try to 
avoid, that'd be great also. It's got me puzzled because I have inspected the 
geometry and there's nothing particularly pathological about it. 

Original comment by stevenlu...@gmail.com on 20 Dec 2011 at 3:01

GoogleCodeExporter commented 9 years ago
I made a quick last-ditch effort which somehow worked: I avoided the stack 
overflow by shifting all vertices so that all position values were positive. 

This is quite interesting. I'm gonna re-run my stress tests, forcing all input 
above the axes. Will post back with results.

Original comment by stevenlu...@gmail.com on 20 Dec 2011 at 3:07

GoogleCodeExporter commented 9 years ago
Continued testing and went a bit further; this one is a failed assertion on a 
277-vertex shape. 

I tried this input shifted +2500.0 on both x and y and that went through 
without an error, so it seems like it's not something about the shape itself 
that's causing these runs to fail. Please verify these results...

Original comment by stevenlu...@gmail.com on 20 Dec 2011 at 3:28

Attachments:

GoogleCodeExporter commented 9 years ago
I ran your poly2tri_stackoverflow.c pointset including the two commented points 
on the Java version of poly2tri and it worked fine.

I tried to find anything weird that could cause some precision issues, and there 
is a case where we get some almost-collinear edges when doing the flip part of 
the constrained algorithm. See attached image. Maybe it doesn't say so much 
without some explanation :P

I tried some different values for the epsilon used in the utils.h InScanArea 
test method. I did this with the Java version. It will only fail for me if I use 
an epsilon <= 1e-15, and the default is 1e-12. You could try 1e-11, but keep 
1e-12 for the Orient2d test. So create a new epsilon just for the InScanArea 
method.
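
In other words, the suggested change to utils.h might look roughly like this (SCAN_AREA_EPSILON is an assumed name for illustration, not something in the library):

// Keep the existing EPSILON for the Orient2d test and introduce a separate,
// looser tolerance used only by InScanArea.
const double EPSILON = 1e-12;            // unchanged, used by Orient2d
const double SCAN_AREA_EPSILON = 1e-11;  // suggested value for InScanArea only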

Original comment by thahlen@gmail.com on 20 Dec 2011 at 6:30

Attachments:

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
poly2tri_assertion.c also worked fine on the Java version.

Since it works it can't be an algorithmic issue. It must be some precision 
thing, but I can't understand why it would work with Java but not C++.

Well, one thing my debugger code does is find the bounding box of the pointset, 
center the pointset around (0,0), and rescale it to the range 0-1. Maybe that 
could have an impact.
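
For what it's worth, a sketch of that kind of pre-pass (centering the pointset on the origin and rescaling it, roughly as described above; the struct and function names are illustrative):

#include <algorithm>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Center the pointset on (0,0) and scale its larger bounding-box extent down
// to roughly unit size. Purely illustrative, not part of poly2tri.
void NormalizePointset(std::vector<Pt>& pts)
{
  if (pts.empty()) return;
  double minx = pts[0].x, maxx = pts[0].x, miny = pts[0].y, maxy = pts[0].y;
  for (std::size_t i = 1; i < pts.size(); ++i) {
    minx = std::min(minx, pts[i].x); maxx = std::max(maxx, pts[i].x);
    miny = std::min(miny, pts[i].y); maxy = std::max(maxy, pts[i].y);
  }
  const double cx = 0.5 * (minx + maxx), cy = 0.5 * (miny + maxy);
  const double extent = std::max(maxx - minx, maxy - miny);
  const double scale = extent > 0.0 ? 1.0 / extent : 1.0;
  for (std::size_t i = 0; i < pts.size(); ++i) {
    pts[i].x = (pts[i].x - cx) * scale;
    pts[i].y = (pts[i].y - cy) * scale;
  }
}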

Original comment by thahlen@gmail.com on 20 Dec 2011 at 6:45

GoogleCodeExporter commented 9 years ago
It does! If I don't recenter and rescale it I get a stack overflow too. 
Will look into this more closely to try to find what fails.

Original comment by thahlen@gmail.com on 20 Dec 2011 at 7:10

GoogleCodeExporter commented 9 years ago
Try with the second dataset also. It's an assertion failure rather than a stack 
overflow. 

Original comment by stevenlu...@gmail.com on 20 Dec 2011 at 6:38

GoogleCodeExporter commented 9 years ago
I have analyzed this further and the issue was my initial guess. The thing is 
that the basic three-point orientation test that is used extensively in this lib 
is scale dependent, e.g. the value I use for epsilon to check for collinearity 
will also depend on scale. That is why comparing to 1e-12 works when the 
pointset is rescaled but not at the original scale; the second one passes the 
test. If you change the epsilon for InScanArea to 1e-11 your pointset should 
work at the original scale.
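
To illustrate the scale dependence: the orientation value is a cross product, so scaling all coordinates by s scales the value by s squared, and a fixed epsilon therefore means very different things at coordinates around 2000 than at coordinates around 1 (standalone sketch, not library code):

#include <cstdio>

struct Pt { double x, y; };

// Twice the signed area of triangle (a, b, c); the quantity the collinearity
// epsilon is compared against.
double Orient(const Pt& a, const Pt& b, const Pt& c)
{
  return (a.x - c.x) * (b.y - c.y) - (a.y - c.y) * (b.x - c.x);
}

int main()
{
  Pt a = {0.0, 0.0}, b = {1.0, 0.0}, c = {2.0, 1e-13};       // c barely off the line ab
  Pt A = {0.0, 0.0}, B = {2000.0, 0.0}, C = {4000.0, 2e-10}; // same shape scaled by 2000
  std::printf("%g vs %g\n", Orient(a, b, c), Orient(A, B, C));
  // Prints roughly 1e-13 vs 4e-07: the small version falls within a 1e-12
  // epsilon (collinear), while the scaled-up copy of the same shape does not.
  return 0;
}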

I have always been scaling my dataset to the range -1,1 and haven't considered 
this until now. 

I have to think some more on this.

I cannot reproduce the error in the second dataset; what assertion does it fail 
on?

Original comment by thahlen@gmail.com on 20 Dec 2011 at 11:34

GoogleCodeExporter commented 9 years ago
sweep/sweep.cc:715: p2t::Point& p2t::Sweep::NextFlipPoint(p2t::Point&, 
p2t::Point&, p2t::Triangle&, p2t::Point&): Assertion `0' failed.

(second dataset)

So you say I should try to limit my domain to -1,1? I'll go give that a spin. 

Original comment by stevenlu...@gmail.com on 20 Dec 2011 at 11:53

GoogleCodeExporter commented 9 years ago
Here's one in the [-1,1] range. 

I've got some information on this data-set that may be of help. On line 19 is a 
point which is 0.00006 distance from the point on line 17. Removing it, however, 
does not fix the problem on my machine. Once I nuke line 140, though, no stack 
overflow. The point on line 140 is 0.00008 from the point on line 138. That 
happens to be the 2nd-smallest distance between points (I check the distance 
from each point to the one before it and the one before that; no adjacent points 
are ever very close because I run Ramer-Douglas-Peucker on my set).

You say epsilon is 1e-11? What would be the size of the smallest feature I can 
afford to have? 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 12:24

GoogleCodeExporter commented 9 years ago
Sorry forgot attachment. 

Points of interest that I mentioned are indented in there. 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 12:25

Attachments:

GoogleCodeExporter commented 9 years ago
[deleted comment]
GoogleCodeExporter commented 9 years ago
I got a "spine eliminator".. Here the features are at minimum 0.0001 across at 
the root. 

I'll just keep sending you failed input data as I encounter them. 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 1:00

Attachments:

GoogleCodeExporter commented 9 years ago
Overflow2 was exactly the same issue as before. After putting my thinking cap 
on for a while I realized that I could probably improve the precision a bit by 
just reordering the way I do the InScanArea test. 

Updating the Java version solved the problem.

Below is the new InScanArea in utils.h. 
Please try it and let me know how it works.

bool InScanArea(Point& pa, Point& pb, Point& pc, Point& pd)
{
  // Twice the signed area of triangle (b, a, d).
  double oadb = (pa.x - pb.x)*(pd.y - pb.y) - (pd.x - pb.x)*(pa.y - pb.y);
  if (oadb >= EPSILON) {
    return false;
  }

  // Twice the signed area of triangle (c, a, d).
  double oadc = (pa.x - pc.x)*(pd.y - pc.y) - (pd.x - pc.x)*(pa.y - pc.y);
  if (oadc <= EPSILON) {
    return false;
  }
  return true;
}

Original comment by thahlen@gmail.com on 21 Dec 2011 at 5:48

GoogleCodeExporter commented 9 years ago
Alright, I can confirm my previously failing cases that I presented are now 
working with that replacement code. Thanks!

I'll come back if I run into any more similar issues. I've found a rather 
frustrating issue with my own segment-segment intersection code... here's the 
link if you're interested: 
http://stackoverflow.com/questions/8585427/precision-issues-with-segment-segment-intersection-code

thanks again.

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 6:16

GoogleCodeExporter commented 9 years ago
Damn. That didn't take long. 

See if this one asserts for you. Did for me. 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 6:22

Attachments:

GoogleCodeExporter commented 9 years ago
In this lib points need to be separated by at least epsilon, e.g. 1e-12.
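
A caller-side sanity check for that requirement might look like the sketch below (O(n^2), for debugging only; in practice the threshold would be far larger than 1e-12, as the feature-size discussion above suggests):

#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

// Returns true if every pair of points is separated by at least min_dist.
// Brute-force check intended for validating test input, not production use.
bool PointsWellSeparated(const std::vector<Pt>& pts, double min_dist)
{
  for (std::size_t i = 0; i < pts.size(); ++i) {
    for (std::size_t j = i + 1; j < pts.size(); ++j) {
      const double dx = pts[i].x - pts[j].x;
      const double dy = pts[i].y - pts[j].y;
      if (std::sqrt(dx * dx + dy * dy) < min_dist) {
        return false;
      }
    }
  }
  return true;
}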

There hasn't been any floating point analysis done on the lib. Its precision 
has been enough for anything I have needed it for to date. I started to look 
into triangulation when I needed to triangulate some 2d fonts, which are pretty 
simple polygons :)

I picked epsilon 1e-12 after running some polygon generation code that 
generated some nasty polygons. I did a circle-sweep polygon with some function 
for altering the radius, increasing the points to around 500k or so. I found 
that epsilon 1e-13 was where the algorithm broke down in precision when using 
that many points, so to be safe I felt that 1e-12 would be enough precision 
for almost any triangulation needs :)

Original comment by thahlen@gmail.com on 21 Dec 2011 at 6:23

GoogleCodeExporter commented 9 years ago
I see. I did notice the debug geometry you have which is circular and very 
spikey. But my method of generating test geometry is a bit more involved and 
produces more random angles and stuff. 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 6:26

GoogleCodeExporter commented 9 years ago
Hehe.. keep 'em coming. Anything that can improve the lib is nice :)

The last dump is weird. On the first triangulation it works, but if I run it a 
second time with the same set I get the assert error too.

Looking into that. 

Original comment by thahlen@gmail.com on 21 Dec 2011 at 6:40

GoogleCodeExporter commented 9 years ago
The last dump works fine with the old InScanArea method :(.

I'll get to the bottom of this later. Guess it might be trickier than I 
expected.

Original comment by thahlen@gmail.com on 21 Dec 2011 at 6:47

GoogleCodeExporter commented 9 years ago
Yeah, I'm generating a few more cases so you have enough of them to play with. 
Soon enough I'll have the entire thing automated...

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 6:51

Attachments:

GoogleCodeExporter commented 9 years ago
3 more

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 6:59

Attachments:

GoogleCodeExporter commented 9 years ago
another

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 7:15

Attachments:

GoogleCodeExporter commented 9 years ago
new random seed

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 7:20

Attachments:

GoogleCodeExporter commented 9 years ago
I did something silly and some of those datasets may run fine. In that case 
don't worry about it. But most of them either asserted or overflowed on me. 
I'll be working on coming up with some different shapes now.

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 7:37

GoogleCodeExporter commented 9 years ago
I did something silly with the new InScanArea too:

  if (oadb >= EPSILON) {
    return false;
  }

should be

  if (oadb >= -EPSILON) {
    return false;
  }
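
Putting the earlier snippet together with this sign fix, the whole function would presumably read:

bool InScanArea(Point& pa, Point& pb, Point& pc, Point& pd)
{
  double oadb = (pa.x - pb.x)*(pd.y - pb.y) - (pd.x - pb.x)*(pa.y - pb.y);
  if (oadb >= -EPSILON) {
    return false;
  }

  double oadc = (pa.x - pc.x)*(pd.y - pc.y) - (pd.x - pc.x)*(pa.y - pc.y);
  if (oadc <= EPSILON) {
    return false;
  }
  return true;
}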

The original issues didn't get fixed by the new InScanArea. I might have to 
tweak the algorithm a bit to fix this.

Original comment by thahlen@gmail.com on 21 Dec 2011 at 8:17

GoogleCodeExporter commented 9 years ago
Alright, now here's something interesting. 

I made a random walk test, which is a 2D random walk. I have a vector that I 
incrementally walk forward, turning slightly at each vertex. Then I add a noisy 
random vector to that (which does not affect the walked vector itself). First I 
set the noise vector small, so it always walks farther than the noise. 

This passed all 5000 test cases I threw at it. I'm gonna run that one in a loop 
overnight. 

So then I modified it so the random factor is increased a bunch, so now I'm 
still walking my location around but each vertex emitted has a ton of noise. 
Then with my perimeter algorithm I get a jaggy convex simple polygon out of that 
mess. It basically looks like my squares and circles from before, same 
jaggedness, but now they're all amoeba-like too. 

These jaggy mofos assert and overflow like crazy. I'm fairly certain I did 
these files right, so you'd best check these ones out. 
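
A rough sketch of the kind of generator being described, for anyone who wants to reproduce this style of input (the constants and names are illustrative, not the actual test code; the perimeter-extraction pass that follows it is not shown):

#include <cmath>
#include <cstdlib>
#include <vector>

struct Pt { double x, y; };

// Noisy 2D random walk: a heading that turns slightly at each step, a position
// that walks forward along it, and per-vertex noise added on output only
// (the noise does not affect the walked position itself).
std::vector<Pt> NoisyRandomWalk(int n, double noise_amplitude)
{
  std::vector<Pt> out;
  double x = 0.0, y = 0.0, heading = 0.0;
  for (int i = 0; i < n; ++i) {
    heading += 0.1 * (2.0 * std::rand() / RAND_MAX - 1.0);     // turn slightly
    const double step = 0.01 + 0.49 * std::rand() / RAND_MAX;  // walk 0.01..0.5
    x += step * std::cos(heading);
    y += step * std::sin(heading);
    Pt p;                                                      // emitted vertex = position + noise
    p.x = x + noise_amplitude * (2.0 * std::rand() / RAND_MAX - 1.0);
    p.y = y + noise_amplitude * (2.0 * std::rand() / RAND_MAX - 1.0);
    out.push_back(p);
  }
  return out;
}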

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 8:19

Attachments:

GoogleCodeExporter commented 9 years ago
I would say the issue only crops up with protruding triangular bits. I might 
design some tests that produce protruding mini-quadrilaterals and pentagons to 
see if that affects it. 

But smooth-ish shapes, even with large numbers of vertices, are handled just 
fine.

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 8:22

GoogleCodeExporter commented 9 years ago
I ran non-jaggy random-walk shapes overnight; it performed 55 loops of 5000 
tests. What I do is make a 3-vertex shape and send it through my polygonator 
routine, then a 4-vertex shape, then 5, ... up to 5000, then I restart with a 
new random seed from 3. This went 55 times, so that's a total of 687 million 
vertices processed. I had a memory leak, which is why it stopped. 

Original comment by stevenlu...@gmail.com on 21 Dec 2011 at 10:59

GoogleCodeExporter commented 9 years ago
That sounds pretty robust ;). Did poly2tri contain the memory leak?

I haven't tested your latest dumps yet. The few I did had the same issue, and 
yes, protruding is what creates them. When you protrude from an edge you get 
two points in the protruded triangle on the exact same line. The way I have 
implemented my constraint algorithm gives a case where the algorithm can start 
to ping-pong between three triangles that (almost) share a common line. 

If I had found this case before I released the lib I would certainly have solved 
it before release. I am glad you found it :). This will actually be the first 
real algorithmic issue found since the release two years ago.

I am sure I can solve this by adding some code for that case; I just want to 
think it through and make a fast, good solution rather than a hurried patch :).

Btw, I am interested in what project you are using the lib for. I guess you 
don't triangulate polygons just for the fun of it ;)

Original comment by thahlen@gmail.com on 22 Dec 2011 at 2:15

GoogleCodeExporter commented 9 years ago
poly2tri has no memory leaks. I did bring up issue 33 on here, which got 
dismissed, but the issue was that the testbed code did not bother to free the 
plist (vector<Point*>)'s pointers. The triangulator does not free these Points 
itself. It shouldn't. It's all logical and I don't think it can be classified 
as an issue, though I'd just prefer it if the C++ testbed code put in a loop to 
free the points at the end. 

I have fixed my leak and have run 5+ billion vertices through the smooth random 
walk tests by now (got 2 cores crunching it, the shapes are mesmerizing to 
watch), so you bet your ass it's a robust algorithm. I'd just love to see the 
jagged shapes go through with it. As you can see from my data it only hits a 
few every hundred tests once it gets above 400 verts. I doubt the game engine 
will ever see many 400+ vertex dynamic geometries, but maybe one day I'll decide 
edge shapes aren't enough for my terrain. What you say about protrusions 
creating lines close together makes sense based on the method I am using to 
generate the geometry, and I'm very glad you were able to narrow it down. I'm 
not rushing you, and an efficient fix would be best; all I'm gonna do is 
compile and drop it in anyway. 

I got interested in convex decomposition back in 2008-ish when I investigated 
methods to create unlimited geometries for Box2D. I stopped developing my game 
engine code at some point in 2009 because I didn't have very clear goals for 
where I wanted it to go, but now that I've graduated I'm looking at this core 
functionality again. I went through several algorithms for triangulation back 
in 2008 and the latest one was the CDT from GTS (GNU Triangulated Surface 
library). It was always a pain in the butt to set up on my dev machines because 
of the GLib dependencies. I wouldn't be surprised if that code also fails on 
these crazy tests I'm running now, but either way I'm sure poly2tri is more 
efficient. I am very glad to find a robust C++ implementation of CDT in 
poly2tri, and naturally once I finished my complex->simple polygon code and a 
triangle->concave polygon code I set about testing the routines.

I'm currently in the process of working out an efficient finite-element method 
for performing stress analysis on my geometries so that I can do more realistic 
dynamic breakage. Turns out this process should be generating data that I can 
use to build a full-on soft body simulation with. Depending on how difficult it 
will be to interface with Box2D I should end up with a pretty good framework 
for a game engine. 

Keep me posted on progress. Let me know if you need more test cases (though I 
think I gave you enough).

Original comment by stevenlu...@gmail.com on 22 Dec 2011 at 2:45

GoogleCodeExporter commented 9 years ago
Correction at the end of the third paragraph: triangle->convex polygon code. 
The triangulator gets the concave data. 

Original comment by stevenlu...@gmail.com on 22 Dec 2011 at 2:48

GoogleCodeExporter commented 9 years ago
Check this out: on C++ poly2tri, there is a failure (sorry, can't remember what 
it was, segfault or assertion) if zero vertices are provided. 

This only cropped up after 35,854 cases. It was the first time that my three 
points were collinear within epsilon 1e-5. My random walk steps forward between 
0.01 and 0.5 each step. RDP nuked vertex #2 as a result and I ended up with 
0-vertex input to CDT.

Figured this would have happened before 35 thousand other attempts... 
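
A caller-side guard against that degenerate case can be as simple as the sketch below (same CDT interface as earlier; the wrapper name is made up):

#include <vector>
#include <poly2tri/poly2tri.h>

// Refuse to hand degenerate input to poly2tri: a simple polygon needs at least
// three vertices, so an empty (or 1-2 point) polyline coming out of an
// aggressive simplification pass is rejected before triangulating.
bool SafeTriangulate(const std::vector<p2t::Point*>& polyline)
{
  if (polyline.size() < 3) {
    return false;  // nothing sensible to triangulate
  }
  p2t::CDT cdt(polyline);
  cdt.Triangulate();
  return true;
}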

Original comment by stevenlu...@gmail.com on 22 Dec 2011 at 3:37

GoogleCodeExporter commented 9 years ago
Any progress? Keep me posted. 

Original comment by stevenlu...@gmail.com on 24 Dec 2011 at 8:49

GoogleCodeExporter commented 9 years ago
I have an idea how to improve the algorithm. But it is also a precision thing... 
so I really need to do a floating point precision analysis of the algorithms to 
find the best epsilon values to use. I haven't really done one of those before 
:P

The attached image is the current case I'm looking at with one of your polygons.
I guess you have intersected the edge ad with a tiny spike and got the 
intersection points b and c. The thing is that b and c are very close together 
and this is causing a precision issue with my current InScanArea test. The 
points abc will be considered collinear, and bcd will also be considered 
collinear, but abd will be just barely outside the epsilon value and be 
considered non-collinear. When this happens the current algorithm will enter an 
endless loop.

This specific case works when I lower the epsilon to 1e-14. The bottom line is 
that I really need to decide what precision can be used and, if there are cases 
that the algorithm can't handle, throw an exception instead of entering an 
endless loop and causing a stack overflow.

Haven't had time to look into this since it's Christmas :-)

Original comment by thahlen@gmail.com on 25 Dec 2011 at 12:28

Attachments:

GoogleCodeExporter commented 9 years ago
Okay. Both b and c are points on ad, right? Why would abd or acd not also be 
collinear? It seems to me any combination of points in a,b,c,d should be 
collinear.

Original comment by stevenlu...@gmail.com on 25 Dec 2011 at 1:49

GoogleCodeExporter commented 9 years ago
Also, do you know what causes the assertion error? Is it caused by the same 
problem? 

Original comment by stevenlu...@gmail.com on 25 Dec 2011 at 1:50

GoogleCodeExporter commented 9 years ago
Yes, in a perfect world any combination of a, b, c, and d would be collinear. 
But a floating point world is discrete: take all the coordinates you can 
describe with a floating point number and zoom in, and you get discrete points 
in R2. 

My guess in this case is that b and c are very close to the line ad. When you 
draw the line ab, the point d will be very close to that line. I could see that 
if b is much closer to a than d is, even a tiny float precision offset of b 
could result in the lines ab and ad not being collinear, by a factor of 1e-13.

The best of all would be if I could change my algorithm to avoid these kind of 
tests. I just haven't figured out how to do that yet.

Original comment by thahlen@gmail.com on 25 Dec 2011 at 2:46

GoogleCodeExporter commented 9 years ago
Ah, I see what you mean now. 

If you know that both B and C are on the line AD could you treat the values 
parametrically for the collinearity test somehow? I guess you wouldn't have 
that knowledge to begin with. 

If this is only an issue when B and C are close together, could you merge them 
into a single point? The tiny sliver of a triangle in there is eliminated. 

Original comment by stevenlu...@gmail.com on 25 Dec 2011 at 3:05

GoogleCodeExporter commented 9 years ago
Now I know this would be a separate topic, but what are your thoughts on 
extending poly2tri with Ruppert's algorithm? See 
http://www.cs.cmu.edu/~quake/tripaper/triangle3.html

I'm getting into FEA, and though poly2tri supports the addition of Steiner 
points, it seems to me that this refinement method can't be beat. What I'm 
puzzling over at this moment is whether it's possible to come up with the 
refinement vertices in a separate step, so that I could achieve the correct 
triangulation by using a CDT without having to modify the CDT algorithm. 

Original comment by stevenlu...@gmail.com on 25 Dec 2011 at 2:16

GoogleCodeExporter commented 9 years ago
I looked into refinement a bit when I worked with this lib two years ago. I 
found this interesting page:
http://www.cise.ufl.edu/~ungor/aCute/algorithm.html

Never got around to doing anything, though, since it wasn't anything I needed myself.

I guess you could run the CDT triangulation on the original polygon. Then you 
need to perform an incremental Delaunay refinement, e.g. find which triangle or 
triangle edge the new point is in, then form the new triangles and do a Delaunay 
operation on these. 

Yes, the best part would be if Poly2Tri had an interface to incrementally add 
points into an existing triangulation.

Original comment by thahlen@gmail.com on 25 Dec 2011 at 11:48

GoogleCodeExporter commented 9 years ago
Yeah, being able to perform the point-addition operation on a triangulated 
state would be amazing. Do you think that would ever be a possibility with this 
codebase? 

I honestly can't believe they were able to improve upon Triangle! The examples 
posted on there are just beautiful. I wonder why Google was not able to clue me 
in about aCute this past week?

If poly2tri were to incorporate some of these features (they don't have to be 
anything as optimal as aCute or Triangle), just allowing me to add a point to 
an existing triangulation and measure the result would be very good. With that 
ability I can probably build a halfway-decent algorithm for making a mesh, and 
I won't be bound by Triangle's license restriction. 

Original comment by stevenlu...@gmail.com on 26 Dec 2011 at 2:16

GoogleCodeExporter commented 9 years ago
I just found this!
http://code.google.com/p/poly2tri-c/

Original comment by stevenlu...@gmail.com on 26 Dec 2011 at 2:23

GoogleCodeExporter commented 9 years ago
Ah yeah, there was someone working on adding refinement to the C version.

Adding a point into a triangulation and refining should not be too hard. Most of 
the needed code exists in the sweepline implementation. The thing is to find 
which triangle the new point is in in an efficient way. One way is to traverse 
triangles in the list until you find the right one; that is pretty fast, though 
of course not as fast as creating some special search structure.
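
A sketch of that kind of walk-based point location, written against a minimal self-contained triangle structure rather than poly2tri's internal classes (which keep equivalent neighbor pointers):

#include <cstddef>

struct Pt { double x, y; };

// Minimal triangle with neighbor links, standing in for the library's internal
// triangle type. neighbor[i] is the triangle across the edge opposite vertex i
// (null on the hull). Vertices are assumed to be in counter-clockwise order.
struct Tri {
  Pt v[3];
  Tri* neighbor[3];
};

// Twice the signed area of (a, b, c); positive when c is left of the edge a->b.
static double Orient(const Pt& a, const Pt& b, const Pt& c)
{
  return (a.x - c.x) * (b.y - c.y) - (a.y - c.y) * (b.x - c.x);
}

// Walk from a starting triangle toward the query point: whenever the point lies
// on the outer side of an edge, step to the neighbor across that edge. Fine as
// a sketch for a Delaunay triangulation; a real implementation needs the usual
// robustness tweaks for points on edges and walks that leave the mesh.
Tri* LocatePoint(Tri* start, const Pt& p)
{
  Tri* t = start;
  bool moved = true;
  while (t != NULL && moved) {
    moved = false;
    for (int i = 0; i < 3; ++i) {
      const Pt& a = t->v[(i + 1) % 3];
      const Pt& b = t->v[(i + 2) % 3];
      if (Orient(a, b, p) < 0.0) {  // p is on the outside of edge (a, b)
        t = t->neighbor[i];         // step across that edge
        moved = true;
        break;
      }
    }
  }
  return t;  // containing triangle, or null if the walk left the mesh
}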

Original comment by thahlen@gmail.com on 26 Dec 2011 at 9:57

GoogleCodeExporter commented 9 years ago
I have another question for you. Does poly2tri include a sweepline/advancing 
front sub-quadratic time segment-segment intersection algorithm? Such a routine 
would help me optimize one of my routines, and I've been unable to find a good 
implementation of one. I'm referring to the Bentley-Ottmann algorithm or 
anything similar.

Original comment by stevenlu...@gmail.com on 3 Jan 2012 at 10:46

GoogleCodeExporter commented 9 years ago
Sorry for no more updates on this issue. I am about to start doing some tests.

Regarding the segment-segment intersection: yes, I have implemented such an 
algorithm and was going to include it with some future release. It works pretty 
well, but there are still some special cases that need to be supported, like 
when two segments are collinear and overlapping, in which case the intersection 
is a segment itself. It just supports point intersections for now.

I haven't implemented this one from any paper, so I don't know if it is similar 
to Bentley-Ottmann. I just got the hang of the sweep-line algorithm when using 
it with the triangulation and saw the potential to use it to sweep 
line segments and check for intersections.

It is implemented in Java and if you want I could extract it from my current 
codebase. I did most of this a month or two after the first poly2tri release so 
I need to refresh my memory. Really want to finish this :). It could be used 
for some fast polygon operations. I think it could handle something like 10k 
segments and 40k intersections in 40ms, if my memory serves me right.

Original comment by thahlen@gmail.com on 4 Jan 2012 at 1:19

GoogleCodeExporter commented 9 years ago
This sounds very awesome. I am interested in helping you complete this and 
porting it to C++. 

My perimeter-following algorithm (which reduces self-intersecting polygons into 
simple polygons while throwing away "loops") depends on a seg-seg intersection 
routine and I have just been using brute force for that. If I get that cleaned 
up and optimized I'll be happy to incorporate it into poly2tri so that 
poly2tri can triangulate any complex polygon shape. 

Original comment by stevenlu...@gmail.com on 4 Jan 2012 at 1:25

GoogleCodeExporter commented 9 years ago
"so that poly2tri can triangulate any complex polygon shape"

That was my goal too, along with including boolean operations between two or 
more polygons. I strayed off and started playing around with OpenGL :). 

Feels like it's time to finally finish this. I will break out the intersection 
code from the code base and make it standalone and easier to work with.

Original comment by thahlen@gmail.com on 4 Jan 2012 at 1:46

GoogleCodeExporter commented 9 years ago
I have heard good things about Clipper http://angusj.com/delphi/clipper.php

Since this also has a permissive license there isn't quite so much need for 
adding boolean ops to p2t. 

What's interesting, though, is that it uses (64-bit) integers. 
Original comment by stevenlu...@gmail.com on 4 Jan 2012 at 1:55