jhasse / poly2tri

2D constrained Delaunay triangulation library
BSD 3-Clause "New" or "Revised" License

Disable collinearity tolerance #40

Closed (roystgnr closed this 2 years ago)

roystgnr commented 2 years ago

This is a naive way to fix #39 ... but at least for my cases, it works!
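For context, the change boils down to no longer treating "within EPSILON of zero" as collinear in Orient2d. Roughly (this is an illustrative sketch of the idea, not the literal diff; the real function and the Orientation/Point types live in poly2tri's common headers):

```cpp
// Sketch only: Orient2d takes the sign of the 2x2 cross-product
// determinant of (a - c) and (b - c).
Orientation Orient2d(const Point& pa, const Point& pb, const Point& pc)
{
  const double detleft  = (pa.x - pc.x) * (pb.y - pc.y);
  const double detright = (pa.y - pc.y) * (pb.x - pc.x);
  const double val = detleft - detright;

  // Before: a band of width 2*EPSILON around zero was reported as COLLINEAR,
  // which misclassifies genuinely oriented (but small or nearly flat) triangles.
  //   if (val > -EPSILON && val < EPSILON) return COLLINEAR;

  // After: only an exactly zero determinant counts as collinear.
  if (val == 0) {
    return COLLINEAR;
  }
  return (val > 0) ? CCW : CW;
}
```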

When I ran the poly2tri unit test suite, I got one failure, but the "failure" turned out to be that a test which had previously been failing now passes, so I'm calling that a success instead and have changed the test to assert success.

I've added a new unit test too, with the original problem from #39 (rewritten to match the coding style of the other tests more closely), so even if it turns out that the Orient2d change isn't the best fix here, we can make sure subsequent fixes don't revert that progress.
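Roughly, the failure mode involves a vertex that is nearly, but not exactly, collinear with its neighbours, which the old tolerance would collapse to COLLINEAR. A minimal usage sketch with made-up coordinates (not the data from #39, which is in the added test) looks something like:

```cpp
// Include path may differ depending on how poly2tri is vendored.
#include <poly2tri/poly2tri.h>
#include <vector>

int main()
{
  // Made-up coordinates: the middle vertex sits 1e-12 above the segment
  // between its neighbours, so it is nearly but not exactly collinear.
  std::vector<p2t::Point*> polyline = {
    new p2t::Point(0.0, 0.0),
    new p2t::Point(1.0, 1e-12),
    new p2t::Point(2.0, 0.0),
    new p2t::Point(1.0, 1.0),
  };

  p2t::CDT cdt(polyline);
  cdt.Triangulate();
  std::vector<p2t::Triangle*> triangles = cdt.GetTriangles();
  // ... check that the triangulation covers the polygon and contains
  // no degenerate triangles ...

  for (p2t::Point* p : polyline) {
    delete p;
  }
  return 0;
}
```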

With this in place my refinement code seems to be working, so I'll start conjuring up a bunch of test cases via that and see if I can trigger any other bugs that this fix missed, or any previously-working cases that it regressed.

roystgnr commented 2 years ago

> I'll start conjuring up a bunch of test cases via that

I'm definitely still hitting a problem, if you want to wait on merging until I've had more time to investigate. I think this one's more likely to be in my code than in yours, though.

roystgnr commented 2 years ago

That problem was a false alarm. While cleaning up my own code's recent changes to strip out the "attempted workaround for poly2tri bug" parts, I accidentally removed an in-progress fix for one of my own tolerance-related bugs as well, because it looked similar. Restoring that fix and getting rid of my own excessive tolerance fixed the problem for most cases; switching to a more robust test in my code then fixed it for every case I've tried. I can now use this branch to refine triangulations into the millions-of-elements range (far past what my users need in 2D; their element counts only explode when meshes are extruded into 3D afterwards) without problems.

roystgnr commented 2 years ago

And non-uniform refinement works too. I'm having a lot of fun with this.

[Image: adaptive_tri_trans (a non-uniformly, adaptively refined triangulation)]

I think non-uniform anisotropic refinement is going to be where it gets scary. The above mesh is still Delaunay in the Euclidean metric, so except for this little bugfix I didn't have to touch poly2tri internals. But if I want thin boundary layer elements that track one boundary orientation in one area and a different boundary orientation in another, that'll be a different story. I may just wrap things up for the year at this point; this is all the functionality my users need right away. If I get ambitious, though... how open are you to much-more-intrusive PRs?

roystgnr commented 2 years ago

Is there anything else I can do to push for getting this merged? I'm now developing using a forked branch downstream, and I was hoping to be able to switch that back to a hash on your master branch before users start relying on it.

I admit I'm not thrilled with this PR either; ideally solutions to hairy FP precision issues ought to be backed up via "mathematical proof in the docs", not "more passes in the test suite". But "more passes in the test suite" is still a big step up from "fewer passes in the test suite".
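(To spell out the worry, as an illustration rather than a proof, writing $a$, $b$, $c$ for the three points passed to Orient2d: the predicate is the sign of

$$\det\begin{pmatrix} a_x - c_x & a_y - c_y \\ b_x - c_x & b_y - c_y \end{pmatrix} = (a_x - c_x)(b_y - c_y) - (a_y - c_y)(b_x - c_x),$$

which scales with the square of the coordinate scale. So a fixed absolute EPSILON makes "collinear" depend on the overall size of the input: small but genuinely oriented triangles land inside the tolerance band, while very large inputs can fall outside it even when rounding error dominates.)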

jhasse commented 2 years ago

I agree.

Let's merge this and see if someone complains.

jhasse commented 2 years ago

Forgot to mention: Thank you very much! Especially for the detailed comments.

roystgnr commented 2 years ago

You're welcome!

> Let's merge this and see if someone complains.

There's probably a Russell conjugation here: "He breaks the code base; you add new features too hastily; I encourage collaborators to write more test coverage."

Tag me in any issues that come up if someone finds a regression? The short-term solution might still be "tack on a revert commit" (you never rewrite history on master, right?) if a longer-standing user's needs conflict with this change, but I'd definitely want to help make sure that the long-term solution is "find a fix that works for both issues" rather than "maintain a code fork" or "get stuck at an old commit".

jhasse commented 2 years ago

Yes, I would not force-push, so your hash on this repo will always be valid :)