mathandy / svgpathtools

A collection of tools for manipulating and analyzing SVG Path objects and Bezier curves.

Vectorised Curve Evaluation #127

estennw opened this issue 4 years ago (status: Open)

estennw commented 4 years ago

Are there any implementations of, or plans to implement, vectorized point evaluation (and evaluation of the normal, unit_tangent, inv_arclength, etc.) for Path objects? Accepting numpy arrays in particular would be desirable for data analysis applications. If one could call

path.point(np.linspace(0, 1, 101))

with a vectorised backend, this could potentially reduce computation time significantly compared to

[path.point(t) for t in np.linspace(0, 1, 101)].
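(For a rough sense of the gap, here is a sketch of the kind of micro-benchmark one might run; the control points are arbitrary, and QuadraticBezier is used only because, as noted further down the thread, its point() already happens to accept arrays.)

    import timeit
    import numpy as np
    from svgpathtools import QuadraticBezier

    # Arbitrary example segment; QuadraticBezier.point() already accepts arrays.
    seg = QuadraticBezier(0 + 0j, 50 + 100j, 100 + 0j)
    ts = np.linspace(0, 1, 101)

    # Point-by-point Python loop vs. passing the whole array at once.
    loop_time = timeit.timeit(lambda: [seg.point(t) for t in ts], number=1000)
    vec_time = timeit.timeit(lambda: seg.point(ts), number=1000)
    print(loop_time, vec_time)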
abey79 commented 4 years ago

CC @tatarize

Funny you should ask just now; I'm looking for, and working towards, exactly the same thing. You will probably be interested in this (admittedly very lengthy) read: meerk40t/svgelements#52.

If you would like to monkey patch your version of point(), here it is for Arc.point():

import numpy as np

def _arc_point_parallel(arc, t):
    """Vectorized Arc.point(): evaluate the arc at every value in the array t."""
    out = np.empty(t.shape, dtype=complex)

    angle = np.radians(arc.theta + t * arc.delta)
    cosphi = arc.rot_matrix.real
    sinphi = arc.rot_matrix.imag
    rx = arc.radius.real
    ry = arc.radius.imag

    out.real = rx * cosphi * np.cos(angle) - ry * sinphi * np.sin(angle) + arc.center.real
    out.imag = rx * sinphi * np.cos(angle) + ry * cosphi * np.sin(angle) + arc.center.imag

    # Match the scalar implementation exactly at the endpoints.
    out[t == 0] = arc.start
    out[t == 1] = arc.end
    return out
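To try the patch, one could sanity-check it against the scalar implementation and then bind it onto Arc (the arc geometry below is an arbitrary example, not from the issue):

    import numpy as np
    import svgpathtools

    # Arbitrary example arc; any valid Arc would do here.
    arc = svgpathtools.Arc(start=0 + 0j, radius=100 + 50j, rotation=30,
                           large_arc=False, sweep=True, end=150 + 50j)

    # Sanity check: the vectorized evaluation matches the scalar Arc.point().
    ts = np.linspace(0, 1, 11)
    assert np.allclose(_arc_point_parallel(arc, ts),
                       np.array([arc.point(t) for t in ts]))

    # Monkey patch (note: the patched method then expects array input for t).
    svgpathtools.Arc.point = _arc_point_parallel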

Also note that QuadraticBezier.point() will already support a numpy array input, because of how simple its implementation is:

    def point(self, t):
        """returns the coordinates of the Bezier curve evaluated at t."""
        return (1 - t)**2*self.start + 2*(1 - t)*t*self.control + t**2*self.end
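For instance (with arbitrary example control points):

    import numpy as np
    from svgpathtools import QuadraticBezier

    q = QuadraticBezier(0 + 0j, 30 + 80j, 100 + 20j)
    print(q.point(np.linspace(0, 1, 5)))  # one complex coordinate per t value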

I don't have anything yet for CubicBezier.point() but I'm working on it.

tatarize commented 4 years ago

Wouldn't CubicBezier.point() work in the same trivially vectorizable fashion as the QuadraticBezier.point() code? Avoid Horner's rule here and you've got it:

    def point(self, t):
        return (1-t)**3*self.start + 3*t*(1-t)**2*self.control1 + 3*(1-t)*t**2*self.control2 + t**3*self.end
abey79 commented 4 years ago

Oh well, you're quite right. The current implementation might well be parallelised already:

    def point(self, t):
        """Evaluate the cubic Bezier curve at t using Horner's rule."""
        # algebraically equivalent to
        # P0*(1-t)**3 + 3*P1*t*(1-t)**2 + 3*P2*(1-t)*t**2 + P3*t**3
        # for (P0, P1, P2, P3) = self.bpoints()
        return self.start + t*(
            3*(self.control1 - self.start) + t*(
                3*(self.start + self.control2) - 6*self.control1 + t*(
                    -self.start + 3*(self.control1 - self.control2) + self.end
                )))

Edit: this one says it's using Horner's rule. Not sure what the implications are.

tatarize commented 4 years ago

It's doing some extra addition and subtraction to avoid the squaring and cubing. From my knowledge of numpy, that should just work: the same operations get applied to every element of the array, and no element needs to interact with any other. In theory it should produce the same result as the default formula, just without the squaring; it's the monomial form of a degree-n polynomial, so I'd say it's likely parallelised already. I'm pretty sure all of those operations work on a numpy array. It's the same formula, just simplified a bit. I might hit it with a timing routine to make sure it's actually faster, since it references the control points a bit more often to reduce the polynomial, and squaring a number isn't a hefty operation anyway.
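A quick numerical check of that claim (with arbitrary example control points, not anything from the thread) is to evaluate both forms on an array of t values and compare:

    import numpy as np
    from svgpathtools import CubicBezier

    c = CubicBezier(0 + 0j, 25 + 100j, 75 + 100j, 100 + 0j)
    t = np.linspace(0, 1, 1001)

    # Bernstein form, as in the snippet above.
    bernstein = ((1 - t)**3*c.start + 3*t*(1 - t)**2*c.control1
                 + 3*(1 - t)*t**2*c.control2 + t**3*c.end)

    # Horner-style form, i.e. the current CubicBezier.point() implementation.
    horner = c.point(t)

    assert np.allclose(bernstein, horner)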

tatarize commented 3 years ago

@estennw you're apparently right; abey did some fantastic work in meerk40t/svgelements#52, which led to the 1.3.x branch of the svgelements code. Processing the request as path.npoint(np.linspace(0, 1, 101)) is massively faster. (point() remains non-vectorized and wraps all responses in Point objects, and the name points would have interfered with Polygon.points and Polyline.points, which are spec-defined.) I don't handle the other math operations; I'm not sure how needed they are, and I don't know whether they isolate easily enough to be borrowed.
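(A minimal sketch of that call, with an arbitrary example path; the npoint() name comes from the comment above, and everything else here is just illustrative.)

    import numpy as np
    from svgelements import Path

    path = Path("M 0,0 C 25,100 75,100 100,0")
    pts = path.npoint(np.linspace(0, 1, 101))  # one evaluated point per parameter value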

mathandy commented 3 years ago

I wish (though keep reading). If I could go back in time, svgpathtools would use numpy much more ubiquitously and take advantage of vectorization in many more places.

I just pushed a commit. For path segments (not paths), the point() method now supports numpy arrays. Thanks to @abey79 for pointing out how simple this was to do for arcs.

The derivative(), normal(), and unit_tangent() methods (for segments only, not paths) now also support numpy arrays.

This all came pretty much for free, requiring only minor changes. Path.point() still does not support numpy arrays. inv_arclength() is another very worthwhile target for vectorization, IMHO.

I also changed all the math and cmath imports in path.py to be imported from numpy, in hopes of encouraging vector-friendly PRs in the future. That may mean a small performance hit, so if anyone comes across this comment and finds it led to a more-than-small performance hit for their workload, please let me know.

Regarding vectorizing Path.point(): I implemented a vectorized version (well, vector-friendly, let's say; I don't think it takes full advantage of the vectorizability of the task). I didn't merge it, as it's mostly untested (it looks like some unit tests might be failing, but I haven't checked why). I'd appreciate input from any interested party that has thoughts on the matter.

Here's a link to the branch. (Or pip install with pip install git+https://github.com/mathandy/svgpathtools.git@vectorize-path-point)

Here's a link to the changed code.

I may or may not make improvements to this or merge this branch -- I'm hoping someone else can take it from here or help out with integrating this new function (i.e. with testing, or with rewriting it in a way that makes it unlikely to break any code that depends on svgpathtools). Also, I think there's still significant room for improvement in its implementation.
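For anyone trying that branch, a hedged sketch of the intended usage (arbitrary example path; as noted above, the branch is unmerged and mostly untested):

    import numpy as np
    from svgpathtools import Path, Line, CubicBezier

    path = Path(Line(0 + 0j, 100 + 0j),
                CubicBezier(100 + 0j, 125 + 100j, 175 + 100j, 200 + 0j))

    # On the vectorize-path-point branch, Path.point() is meant to accept an array.
    pts = path.point(np.linspace(0, 1, 101))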

tatarize commented 3 years ago

@mathandy Since you updated the Travis token, all tests have been failing. Even my fairly harmless collection of bug fixes is failing, even though I tested it locally and it passes there; it's failing tests that were not failing before, and I added coverage for my changes to demonstrate that they fix things that were genuinely broken.

mathandy commented 3 years ago

@tatarize thanks for bringing this up -- that was pretty sloppy of me. Switching math.sqrt to numpy.sqrt in path.py caused the problem (numpy.sqrt returns nan where math.sqrt had raised a ValueError). I also fixed an old bug in one of the tests. All tests should be back to passing now (hopefully travis-ci agrees).
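For reference, the behavioral difference in question:

    import math
    import numpy as np

    print(np.sqrt(-1.0))   # nan, with a RuntimeWarning (no exception raised)
    try:
        math.sqrt(-1.0)
    except ValueError as err:
        print(err)         # math domain error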