noinia / hgeometry

HGeometry is a library for computing with geometric objects in Haskell. It defines basic geometric types and primitives, and it implements some geometric data structures and algorithms. The two main focuses are: (1) strong type safety, and (2) implementations of geometric algorithms and data structures that have good asymptotic running-time guarantees.

Comprehensive feature wishlist / brain dump. #50

Open lemmih opened 3 years ago

lemmih commented 3 years ago

Algorithms:

Infrastructure:

Test properties:

Library:

Branding:

Incredible overview of Computational Geometry: Visibility Algorithms In The Plane

Another good overview: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.48.2190&rep=rep1&type=pdf

Overview of intersection algorithms: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.63.2041&rep=rep1&type=pdf

Handbook of Discrete and Computational Geometry: http://www.csun.edu/~ctoth/Handbook/HDCG3.html

PhD about numerical errors: https://tel.archives-ouvertes.fr/tel-03116750/document

noinia commented 3 years ago

That seems like a pretty nice TODO list :).

With respect to the 'isCounterClockwiseOrder':

* I wonder if we should just 'normalize' the orientation of the vertices, i.e. keep as an invariant that the vertices are always in, say, CCW order.

edit: Hmm, if we want to enforce that using something like a smart constructor, that actually seems to force Eq r and Fractional r constraints on us. That does not seem very nice.

Somewhat related to this. I've actually been wondering if I should have used an Array/Vector to store the vertices instead of a (circular) Sequence. That would allow us O(1) time indexing (instead of the O(log n) we currently have). We do "lose" persistence. But my impression is that most of the time we use polygons in a read-only setting anyway. Any opinions on this?

With respect to approximate convex subdivision:

I'll try to implement/add something like that this weekend :).

noinia commented 3 years ago

For posterity let me also add here: I've been working on implementing an O(n log n) time algorithm to compute the convex hull of a set of points in R^3. This would also give O(n log n) time algorithms for e.g. the Voronoi diagram and the Delaunay triangulation. The implementation in hgeometry-devel seems to work, but it doesn't really like degenerate inputs (e.g. 4 coplanar points).

To better deal with degeneracies I've (partially) implemented 'Simulation of Simplicity': a generic way of dealing with degeneracies. I haven't fully integrated that with the rest of the library yet. (Essentially, this mechanism would replace some of the basic primitives that we are currently using, e.g. to check if 3 points make a CCW or CW turn, to compute intersections etc). When I have some free time again I hope to continue that work.
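
As a point of reference, the CCW/CW primitive mentioned above boils down to the sign of a 2x2 determinant. A minimal standalone sketch (not HGeometry's actual `ccw`, and without any Simulation of Simplicity perturbation), just to make the primitive concrete:

```haskell
-- Minimal sketch of the orientation primitive; HGeometry's own `ccw` is
-- richer, and SoS would symbolically perturb this test. This only shows
-- the underlying determinant.
data Orientation = CCW | CW | Collinear deriving (Show, Eq)

-- | Orientation of the turn p -> q -> r: the sign of the determinant
--   | qx - px   rx - px |
--   | qy - py   ry - py |
orientation :: (Num r, Ord r) => (r, r) -> (r, r) -> (r, r) -> Orientation
orientation (px, py) (qx, qy) (rx, ry) = case compare det 0 of
    GT -> CCW
    LT -> CW
    EQ -> Collinear
  where
    det = (qx - px) * (ry - py) - (qy - py) * (rx - px)
```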

lemmih commented 3 years ago

With respect to the 'isCounterClockwiseOrder':

* I wonder if we should just 'normalize' the orientation of the vertices, i.e. keep as an invariant that the vertices are always in, say, CCW order.

edit: Hmm, if we want to enforce that using something like a smart constructor, that actually seems to force Eq r and Fractional r constraints on us. That does not seem very nice.

We could simplify the constraint to Eq r and Num r. Would that be acceptable? I'm having a hard time thinking of polygons with points that aren't numbers and do not have equality tests. Maintaining a CCW invariant on all polygons would be a big win from my point of view.
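
For illustration, a minimal sketch of what such a normalizing smart constructor could look like, using the shoelace formula. (Note that testing the sign of the signed area needs Ord r rather than just Eq r; the names and representation below are simplified placeholders, not HGeometry's actual types.)

```haskell
-- Hypothetical sketch, not HGeometry's API: a smart constructor that
-- normalizes the vertex order to CCW, so the invariant holds by construction.
newtype SimplePolygon r = SimplePolygon [(r, r)]

-- | Twice the signed area (shoelace formula); positive iff the vertices are CCW.
doubledSignedArea :: Num r => [(r, r)] -> r
doubledSignedArea vs =
  sum [ x1 * y2 - x2 * y1
      | ((x1, y1), (x2, y2)) <- zip vs (drop 1 vs ++ take 1 vs) ]

-- | Smart constructor: reverse the input if it was given in CW order.
mkSimplePolygon :: (Num r, Ord r) => [(r, r)] -> SimplePolygon r
mkSimplePolygon vs
  | doubledSignedArea vs < 0 = SimplePolygon (reverse vs)
  | otherwise                = SimplePolygon vs
```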

lemmih commented 3 years ago

Somewhat related to this. I've actually been wondering if I should have used an Array/Vector to store the vertices instead of a (circular) Sequence. That would allow us O(1) time indexing (instead of the O(log n) we currently have). We do "lose" persistence. But my impression is that most of the time we use polygons in a read-only setting anyway. Any opinions on this?

Completely in favor of this. I can draft a PR so we can see if there are unexpected consequences.
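
To make the trade-off concrete, a small sketch of Vector-backed cyclic indexing (using the vector package; hypothetical names, not necessarily what the refactor would look like):

```haskell
import qualified Data.Vector as V

-- Sketch: a cyclic vertex sequence backed by a Vector, giving O(1) indexing.
-- Assumes a non-empty vector.
newtype CyclicVec a = CyclicVec (V.Vector a)

-- | O(1) indexing; the index is taken modulo the length, so vertex (i + n)
--   is the same as vertex i, as expected for polygon vertices.
index :: CyclicVec a -> Int -> a
index (CyclicVec v) i = v V.! (i `mod` V.length v)

-- | Changing the starting vertex is O(n) here (it copies the vector), whereas
--   a finger-tree based circular Seq does this in O(log n); that is the price
--   paid for the O(1) indexing above.
rotateLeft :: Int -> CyclicVec a -> CyclicVec a
rotateLeft k (CyclicVec v) = CyclicVec (V.drop k' v V.++ V.take k' v)
  where k' = k `mod` V.length v
```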

noinia commented 3 years ago

We could simplify the constraint to Eq r and Num r. Would that be acceptable? I'm having a hard time thinking of polygons with points that aren't numbers and do not have equality tests. Maintaining a CCW invariant on all polygons would be a big win from my point of view.

Ah, well spotted that we can simplify the constraint to (Num r, Eq r) instead. I think I can live with those constraints, and having a CCW invariant would indeed be worth it.

To support O(1) indexing in polygons with holes we also need to store slightly more information about holes (some cumulative number of vertices on all holes 'up to hole h'). Therefore, I think we should refactor things to make SimplePolygon/Polygon an opaque type, and have smart constructors that guarantee those constraints (and some pattern synonyms to get some sort of backward compatibility).

lemmih commented 3 years ago

To support O(1) indexing in polygons with holes we also need to store slightly more information about holes (some cumulative number of vertices on all holes 'up to hole h'). Therefore, I think we should refactor things to make SimplePolygon/Polygon an opaque type, and have smart constructors that guarantee those constraints (and some pattern synonyms to get some sort of backward compatibility).

Do we need O(1) indexing of all vertices? I would have thought that O(1) indexing of the boundary vertices and for any given hole would be sufficient for all purposes.

lemmih commented 3 years ago

There is also a fairly simple O(n log n) time sweepline algorithm that even works in polygons with holes (rotate a half-line around the query point; maintain the edges intersected by the line in a BST) that would be nice to implement/add :)

Tetsuo Asano's algorithm, right? I've looked and looked and I cannot find a copy of his 1983 article. Is his algorithm described in detail somewhere online?

noinia commented 3 years ago

To support O(1) indexing in polygons with holes we also need to store slightly more information about holes (some cumulative number of vertices on all holes 'up to hole h'). Therefore, I think we should refactor things to make SimplePolygon/Polygon an opaque type, and have smart constructors that guarantee those constraints (and some pattern synonyms to get some sort of backward compatibility).

Do we need O(1) indexing of all vertices? I would have thought that O(1) indexing of the boundary vertices and for any given hole would be sufficient for all purposes.

Well, in the current situation, if you have a polygon with many, say \Omega(n), holes, and you ask "give me information about vertex i", where i is some vertex of a hole (but you don't really know which one), that would take O(n) time. With slightly more information we can make that O(1). That seems like a useful thing to have, especially since we are currently fiddling with the definition of Polygon anyway.
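
A sketch of the kind of bookkeeping this refers to (hypothetical names, not the actual Polygon type): store the cumulative vertex counts next to the outer boundary and the holes. A binary search on those offsets then routes a global vertex index to the right hole in O(log h); a precomputed index-to-hole table would make even that O(1), at the cost of O(n) extra space.

```haskell
import qualified Data.Vector as V

-- Hypothetical sketch of the extra bookkeeping, not HGeometry's actual types.
data MultiPolygon r = MultiPolygon
  { outerBoundary :: V.Vector (r, r)             -- outer boundary vertices
  , holeVertices  :: V.Vector (V.Vector (r, r))  -- vertices of each hole
  , holeOffsets   :: V.Vector Int                -- offsets ! i = #vertices before hole i
  }

mkMultiPolygon :: V.Vector (r, r) -> V.Vector (V.Vector (r, r)) -> MultiPolygon r
mkMultiPolygon out hs = MultiPolygon out hs offs
  where offs = V.prescanl (+) (V.length out) (V.map V.length hs)

-- | Global vertex lookup: an O(log h) binary search on the offsets finds the
--   hole containing global index i; the access itself is then O(1).
vertexAt :: MultiPolygon r -> Int -> (r, r)
vertexAt (MultiPolygon out hs offs) i
  | i < V.length out = out V.! i
  | otherwise        = (hs V.! h) V.! (i - offs V.! h)
  where
    -- largest hole index whose offset is <= i (assumes at least one hole)
    h = go 0 (V.length offs - 1)
    go lo hi
      | lo >= hi  = lo
      | otherwise = let mid = (lo + hi + 1) `div` 2
                    in if offs V.! mid <= i then go mid hi else go lo (mid - 1)
```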

noinia commented 3 years ago

There is also a fairly simple O(n log n) time sweepline algorithm that even works in polygons with holes (rotate a half-line around the query point; maintain the edges intersected by the line in a BST) that would be nice to implement/add :)

Tetsuo Asano's algorithm, right? I've looked and looked and I cannot find a copy of his 1983 article. Is his algorithm described in detail somewhere online?

Yes. It's a very basic sweepline algorithm, so it appears in basic textbooks about computational geometry, e.g. Computational Geometry: Algorithms and Applications, but probably not in other printed articles.
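
To make the sweep slightly more concrete, a tiny sketch of its first ingredient, the event order: vertices are processed by angle around the query point q, and the BST of edges currently crossed by the half-line (the actual work, not shown here) is updated at each such event. This is just an illustrative helper, not HGeometry's implementation:

```haskell
import Data.List (sortOn)

-- Illustrative sketch: order the polygon vertices by angle around the query
-- point q. These are the events of the rotational sweep; a BST of the edges
-- currently crossed by the half-line would be updated at each of them.
angularOrder :: (Double, Double) -> [(Double, Double)] -> [(Double, Double)]
angularOrder (qx, qy) = sortOn angleAround
  where angleAround (x, y) = atan2 (y - qy) (x - qx)
```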

noinia commented 3 years ago

Rather than describing the algorithm I just ended up implementing a large part of it just now. See #62

noinia commented 3 years ago

I've opened up a separate issue (#71) to discuss things related to visual unit tests and debugging.

In terms of features I think there are also some low-hanging fruit like:

noinia commented 3 years ago

By the way, regarding visibilityPolygon from an edge: some colleagues and I are working on a very simple O(n) time algorithm for triangulated polygons. One of our students implemented it in C++, and it seems to be almost as fast as computing the visibility polygon from a point. We should hopefully be able to make that write-up public soon. We can then also implement it in HGeometry :)

lemmih commented 3 years ago

Exciting. HGeometry might get a cutting edge algorithm before CGAL.

lemmih commented 3 years ago

In terms of features I think there are also some low-hanging fruit like:

* [ ]  add a concrete implementation of a segment tree storing 2D line segments (currently there is a generic SegmentTree module supporting monoidal annotations; it should be fairly easy to turn that into something actually workable)

* [ ]  planar point location, either using the segment tree or via a persistent sweep. I think I even partially implemented the planar sweep in some file in the repo already.

I don't understand what these two mean. Do you have links to papers or articles that explain the algorithms/problems?

noinia commented 3 years ago

In terms of features I think there are also some low-hanging fruit like:

* [ ]  add a concrete implementation of a segment tree storing 2D line segments (currently there is a generic SegmentTree module supporting monoidal annotations; it should be fairly easy to turn that into something actually workable)

* [ ]  planar point location, either using the segment tree or via a persistent sweep. I think I even partially implemented the planar sweep in some file in the repo already.

I don't understand what these two mean. Do you have links to papers or articles that explain the algorithms/problems?

For the first one: segment trees solve the following problem: given a set of disjoint segments in the plane, store them in a data structure such that, given a vertical query segment Q, we can quickly report the segments intersected by Q. See for example slide 27 and onwards of these slides. Segment trees use O(n log n) space and support such queries in O(log^2 n + k) time (or even O(log n + k) using some additional tricks).
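
To pin down the query itself (not the data structure), here is a naive O(n) reference, restricted for simplicity to non-vertical input segments; a segment tree answers exactly this query in O(log^2 n + k) time. Types and names are placeholders, not HGeometry's:

```haskell
-- Naive O(n) reference for the query a 2D segment tree answers in
-- O(log^2 n + k): report the (non-vertical) segments crossed by the vertical
-- query segment at x = qx with y-range [ylo, yhi].
type Point   = (Double, Double)
type Segment = (Point, Point)

stabbedBy :: Double -> (Double, Double) -> [Segment] -> [Segment]
stabbedBy qx (ylo, yhi) = filter hit
  where
    hit ((x1, y1), (x2, y2)) = inXRange && inYRange
      where
        inXRange = min x1 x2 <= qx && qx <= max x1 x2
        -- y-coordinate of the (non-vertical) segment at x = qx
        yAtQx    = y1 + (y2 - y1) * (qx - x1) / (x2 - x1)
        inYRange = ylo <= yAtQx && yAtQx <= yhi
```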

For the second one: the problem is, given a planar subdivision, store it such that, given an arbitrary query point q \in R^2, we can find the face containing q. This problem is equivalent to vertical ray shooting, i.e. given q, find the first edge vertically above q. (Our planar Subdivision data structure then allows us to "retrieve" the face from that edge/line segment in constant time.)

To solve vertical ray shooting there are two options:

a) store the edges (= line segments) in a segment tree. (The segment tree can report this edge in O(log n) time.)

b) plane sweep with a partially-persistent BST: sweep a vertical line over the plane, while maintaining the edges that are currently intersected by the vertical line (in the order in which the segments intersect the sweep line). To answer a query: find (using a binary search) the version of the BST (a Data.Set) that corresponds to the "time" at which the sweep line was at x-coordinate q_x, then do another binary search to find the segment directly above q. It seems I already partially implemented this in https://github.com/noinia/hgeometry/blob/master/hgeometry/src/Algorithms/Geometry/PointLocation/PersistentSweep.hs
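
As a deliberately simplified, non-persistent sketch of option b): a plain slab decomposition that stores, for every slab between consecutive endpoint x-coordinates, the segments crossing it (worst-case O(n^2) space). The persistent-sweep version keeps the same per-slab information but shares structure between adjacent versions of the BST, which is what keeps the space near-linear. Names and types below are placeholders:

```haskell
import qualified Data.Map.Strict as M
import           Data.List       (sort)
import           Data.Maybe      (mapMaybe)

-- Simplified, non-persistent sketch of vertical ray shooting via slabs.
-- Assumes non-vertical, interior-disjoint segments.
type Point   = (Double, Double)
type Segment = (Point, Point)

-- | y-coordinate of a (non-vertical) segment at a given x.
yAt :: Double -> Segment -> Double
yAt x ((x1, y1), (x2, y2)) = y1 + (y2 - y1) * (x - x1) / (x2 - x1)

-- | Map from the left x-coordinate of each slab to the segments spanning it.
buildSlabs :: [Segment] -> M.Map Double [Segment]
buildSlabs segs = M.fromList [ (x, spanning x) | x <- xs ]
  where
    xs         = sort [ x | ((x1, _), (x2, _)) <- segs, x <- [x1, x2] ]
    spanning x = [ s | s@((x1, _), (x2, _)) <- segs, min x1 x2 <= x, x < max x1 x2 ]

-- | First segment vertically above the query point (if any): binary search
--   (lookupLE) for the slab containing q_x, then search within that slab.
shootUp :: M.Map Double [Segment] -> Point -> Maybe Segment
shootUp slabs (qx, qy) = do
  (_, segs) <- M.lookupLE qx slabs
  let above = mapMaybe (\s -> let y = yAt qx s
                              in if y >= qy then Just (y, s) else Nothing) segs
  case above of
    [] -> Nothing
    _  -> Just (snd (minimum above))
```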

More low-hanging fruit would be Voronoi diagrams: since we already have the Delaunay triangulation, we can convert it into a Voronoi diagram in linear time. I also already partially implemented that in https://github.com/noinia/hgeometry/blob/master/hgeometry/src/Algorithms/Geometry/VoronoiDiagram/DualDT.hs. There are too many interesting things to think about/try, so I sometimes end up starting on something and not finishing it properly.
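
The key geometric primitive in that dualization is placing the Voronoi vertex dual to each Delaunay triangle at the triangle's circumcenter; for reference, a small standalone version of that primitive (not the DualDT code itself):

```haskell
-- Standalone sketch of the primitive behind the DT -> Voronoi conversion:
-- the Voronoi vertex dual to a Delaunay triangle is its circumcenter.
type Point = (Double, Double)

circumcenter :: Point -> Point -> Point -> Maybe Point
circumcenter (ax, ay) (bx, by) (cx, cy)
  | d == 0    = Nothing   -- degenerate: the three points are collinear
  | otherwise = Just (ux / d, uy / d)
  where
    d  = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    a2 = ax * ax + ay * ay
    b2 = bx * bx + by * by
    c2 = cx * cx + cy * cy
    ux = a2 * (by - cy) + b2 * (cy - ay) + c2 * (ay - by)
    uy = a2 * (cx - bx) + b2 * (ax - cx) + c2 * (bx - ax)
```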

mrehayden1 commented 3 months ago

Here's an updated link to the Greiner-Hormann paper: https://www.inf.usi.ch/hormann/papers/Greiner.1998.ECO.pdf