akuukka / quickhull

C++ implementation of the 3D QuickHull algorithm

Testing Various Cases, and Potential fix #24

Open Kushal-Shah-03 opened 2 months ago

Kushal-Shah-03 commented 2 months ago

Based on https://github.com/akuukka/quickhull/issues/23, I have been experimenting with potential fixes and have created a draft PR so you can review some of the changes I have made. @elalish and @pca006132, you can go through the PR as well; it has some commented-out conditions I tried, which might give you an idea of what I explored. The current fix I have pushed appears to perform significantly better on the dataset.

akuukka commented 2 months ago

Normalizing is not a good idea, generally speaking. When the point cloud vectors are very big (e.g. all points' norms are more than 100000000) or very small (e.g. all points' norms are less than 0.000001), it will cause a great deal of additional numerical inaccuracy.

It may help in some special cases, especially when the vector components are not too small or big, but more often than not it only makes things more unstable, and of course there is always a performance hit.

pca006132 commented 2 months ago

I think with floating point, as long as you are not doing addition/subtraction, this should not lead to numerical issues? Taking a sqrt loses precision, but that loss is independent of the magnitude of the coordinates.

In fact, I recall that when I was taking a computational geometry class, the professor said we should always scale and translate all points to [-1, 1] to avoid catastrophic cancellation caused by, e.g., adding points with large magnitudes to points near the origin.

And looking at the code, it seems that it does require normalization somewhere, e.g. QuickHull.hpp line 462 as mentioned above; otherwise these distances are not comparable.

(sorry for the bad formatting, I am on my phone now)

akuukka commented 2 months ago

True, we're not doing addition/subtraction, so it shouldn't cause too many issues. But the very first versions of my quickhull algorithm did use normalization, and in my experience it always caused trouble (more failures to solve the horizon edge loop).

starseeker commented 2 months ago

> When the point cloud vectors are very big (e.g. all points' norms are more than 100000000) or very small (e.g. all points' norms are less than 0.000001), it will cause a great deal of additional numerical inaccuracy.

Do you happen to know of any test cases from your initial work that triggered those problems we could use? Or should some be created to check behavior in such cases?

akuukka commented 2 months ago

Unfortunately, it was 10 years ago, so nope...

elalish commented 2 months ago

@akuukka we're thinking of forking quickhull and pulling it into Manifold, to make it easier to parallelize and ensure our CI is testing it across our various platforms. I suppose this is fine given it's marked as Public Domain. Still, I'd like to give you credit - is there anything you'd appreciate we include in our Apache 2.0 license header?

akuukka commented 2 months ago

> is there anything you'd appreciate we include in our Apache 2.0 license header?

Not particularly, just go ahead. I'm happy to hear that you are finding quickhull useful.