MFEK / math.rlib

A library which supplies mathematics and algorithms for manipulating beziers.
Apache License 2.0

Collinear handles must remain collinear when stroked #15

Closed simoncozens closed 2 years ago

simoncozens commented 2 years ago

Here's the top of a circle. Notice that there is a smooth join (G1 continuity) between the two Bezier segments because the handles are collinear:

Screenshot 2021-10-13 at 13 59 12

Here's the current MFEK stroke output on that curve:

Screenshot 2021-10-13 at 14 00 22

Notice that the handles are no longer collinear, leading to a bumpy join (a G1 discontinuity):

Screenshot 2021-10-13 at 14 01 15

Our circle has become pointy!

Correct output should be:

Screenshot 2021-10-13 at 14 01 50 Screenshot 2021-10-13 at 14 02 05
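For concreteness, the smoothness being lost here is a purely geometric condition on the handles. A minimal sketch in Rust (bare tuples rather than any of MFEK/math.rlib's actual point types) of the test that defines a G1 join:

// A join is G1-smooth when the two handle vectors around the on-curve
// point are collinear and point in opposite directions.
fn is_smooth_join(on_curve: (f64, f64), incoming: (f64, f64), outgoing: (f64, f64)) -> bool {
    let (ax, ay) = (incoming.0 - on_curve.0, incoming.1 - on_curve.1);
    let (bx, by) = (outgoing.0 - on_curve.0, outgoing.1 - on_curve.1);
    let cross = ax * by - ay * bx; // zero when the three points are collinear
    let dot = ax * bx + ay * by;   // negative when the handles point apart
    cross.abs() <= 1e-6 * ax.hypot(ay) * bx.hypot(by) && dot < 0.0
}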
MatthewBlanchard commented 2 years ago

I think I have an angle for fixing this. I'll get back to you on this.

ctrlcctrlv commented 2 years ago

Thanks for turning this into a real issue. It just goes to show it's not only a spot of OCD on my part, but a problem real users notice; we've known about it for a while. I considered writing scripts for my own fonts to force handles collinear if they are "almost collinear", but I'll wait and see what Matt comes up with.

simoncozens commented 2 years ago

"I think I have an angle"

Seriously, man? :-p

simoncozens commented 2 years ago

It’s not just OCD. Things that should be G1 (or even G2) but aren’t are a big deal in font work.

MatthewBlanchard commented 2 years ago

So, after some testing, this is actually unrelated to the idea I had about what could be causing it. I rewrote a pretty significant portion of the curve offsetting code and ended up with the same result, and the same sadness. I think this might be due to the innate inaccuracy of offsetting, sampling, and then fitting the curves.

There are a few ways of approaching this. First, and most crudely, you can subdivide the curves to get more sample points, which helps with this issue (see the sketch at the end of this comment). Second, and I think more promising, would be to look at the underlying curves, see how collinear their handles are, and then 'nudge' the output handles until they are as close to collinear as the input handles.

If you have any literature on balancing curves I would really appreciate it. I'm going to look more into this tomorrow after the meeting and hopefully have something before I start my weekend.
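As a concrete sketch of the first (crude) approach: plain de Casteljau subdivision, which doubles the number of segments and hence the sample points the fitter sees. Plain arrays here, not MFEK/math.rlib's curve types.

// Split a cubic Bezier p0..p3 at parameter t into two cubics via de
// Casteljau's construction; each half can then be sampled separately.
fn lerp(a: [f64; 2], b: [f64; 2], t: f64) -> [f64; 2] {
    [a[0] + (b[0] - a[0]) * t, a[1] + (b[1] - a[1]) * t]
}

fn subdivide(p: [[f64; 2]; 4], t: f64) -> ([[f64; 2]; 4], [[f64; 2]; 4]) {
    let (p01, p12, p23) = (lerp(p[0], p[1], t), lerp(p[1], p[2], t), lerp(p[2], p[3], t));
    let (p012, p123) = (lerp(p01, p12, t), lerp(p12, p23, t));
    let mid = lerp(p012, p123, t); // the split point, on the curve itself
    ([p[0], p01, p012, mid], [mid, p123, p23, p[3]])
}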

simoncozens commented 2 years ago

So, first I would say that if the input is smooth (G1) but the output is not smooth, there might be something wrong with the algorithm. I can't prove it, and there may be circumstances where either the outside or the inside curves would end up non-smooth, but it just... feels wrong.

Second, the fact that simple noodles are coming out with non-balanced handles also feels like there's something wrong:

Screenshot 2021-10-15 at 08 32 46

(Not only does it feel like there's something wrong, the consequence of this is that when two noodles meet at a 90° join, they get slightly different curves:

Screenshot 2021-10-15 at 08 34 35

This buggers up overlap removal processing.)

But I don't know if there actually is anything wrong with the algorithm, or if that's the best that can be done. If it is the best that can be done, here's what I would do to mitigate the problem.

First, record which on-curve points in the input have collinear handles. Then do your stroke. Now, for each corresponding on-curve point in the output:

# Keep each handle's current length...
incoming_length = on_curve.distanceFrom(incoming_handle)
outgoing_length = on_curve.distanceFrom(outgoing_handle)

# ...but average the two handle directions; the opposite handle's angle is
# flipped 180 degrees so both are measured along the same axis:
new_incoming_angle = 0.5 * ( on_curve.angle(incoming_handle) + on_curve.angle(outgoing_handle).rotate(180) )
new_outgoing_angle = 0.5 * ( on_curve.angle(outgoing_handle) + on_curve.angle(incoming_handle).rotate(180) )

# Re-place each handle at its old length along its new angle (Vector(1, 0)
# is a unit vector, so the rotated result has exactly the stored length):
incoming_handle.set_position(on_curve + (Vector(1, 0) * incoming_length).rotate(new_incoming_angle))
outgoing_handle.set_position(on_curve + (Vector(1, 0) * outgoing_length).rotate(new_outgoing_angle))

Because of the unequal handle length issue above, I would also suggest balancing the handles in all curve segments. The balance operation goes like this. For this, we are not looking at the incoming/outgoing handles of a join, but across a segment:

Screenshot 2021-10-15 at 09 02 43
# c0 and c3 are the on-curve endpoints of the segment, c1 and c2 their
# respective handles, and p is the intersection of the two handle lines
# (extended). Express each handle as the fraction of the distance from its
# on-curve point to p at which it sits; if p coincides with the on-curve
# point, fall back to a fixed fraction:
if c0.distanceFrom(p) == 0.0:
  fraction1 = 0.43
else:
  fraction1 = c0.distanceFrom(c1) / c0.distanceFrom(p)
if c3.distanceFrom(p) == 0.0:
  fraction2 = 0.73
else:
  fraction2 = c3.distanceFrom(c2) / c3.distanceFrom(p)
# Move both handles to the average fraction along their lines towards p:
avg = (fraction2 + fraction1) / 2.0
if avg > 0 and avg < 1:
  c1 = c0.lerp(p, avg)
  c2 = c3.lerp(p, avg)

I can't remember if you need to balance before forcing G2 continuity as well, but you definitely need to do it at the end.

ctrlcctrlv commented 2 years ago

@simoncozens I have good news: I have perhaps figured this out. I believe that the problem stems from flo_curves. I noticed something very suspicious while reading the code: it seems to be using the quadratic version of the curve fitting function?

One thing I always found suspicious about this library is how few points it gives in comparison to @skef's work. Perhaps we now know why.

Considering the input: image

MFEKstroke would normally give this bad, non-smooth result: image

However, via a patch I achieved something much more pleasing: image image

Simple patch:

diff --git a/src/bezier/offset_lms.rs b/src/bezier/offset_lms.rs
index 28794a6..1762925 100644
--- a/src/bezier/offset_lms.rs
+++ b/src/bezier/offset_lms.rs
@@ -76,5 +76,5 @@ where   Curve:          BezierCurveFactory+NormalCurve,
         .collect::<Vec<_>>();

     // Generate a curve using the sample points
-    fit_curve(&sample_points, max_error)
-}
\ No newline at end of file
+    Some(fit_curve_cubic(&sample_points, &curve.tangent_at_pos(0.0), &curve.tangent_at_pos(1.0), max_error))
+}

What does @Logicalshift think?

The beauty is increased across the board: image

I think @Logicalshift made a mistake and @MatthewBlanchard didn't catch it, is all.

skef commented 2 years ago

It's almost inherent to offsetting that you know the tangent angle at the offset point -- this is true even of "general" offsetting. In that light it makes the most sense to "clamp" the spline ends at those angles and fit points in between. Then if the best fit has a point outside of the tolerance you can subdivide somewhere and try again, clamping the angles at the intermediate point because you know the tangent there as well.

The revised circle above does look better but the point distribution doesn't make much sense -- not sure what would be causing that.

I've recently realized that it's probably unnecessary to do any cubic "fitting" for offset curves, even in the general case, and that one can instead generate points only to evaluate accuracy for potential splitting. This is because when offsetting -- even "general" offsetting -- one also knows the curvature at each offset point so you can just solve for that at each end instead. I haven't implemented that solution though. The main advantage is that you'll get curvature matching at all points that are supposed to be same-curvature.
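To make the clamping idea concrete, a minimal sketch (hypothetical names, neither flo_curves nor MFEK/math.rlib API): once the end tangents are fixed, a cubic fit has only two free scalars left, the handle lengths.

// A cubic whose end tangent directions are clamped: only the handle
// lengths a and b remain for the fitter to choose (or, as suggested
// above, to solve for from the known end curvatures).
fn clamped_cubic(
    p0: [f64; 2], t0: [f64; 2], // start point and unit tangent there
    p3: [f64; 2], t1: [f64; 2], // end point and unit tangent there
    a: f64, b: f64,             // handle lengths: the only free parameters
) -> [[f64; 2]; 4] {
    [
        p0,
        [p0[0] + a * t0[0], p0[1] + a * t0[1]],
        [p3[0] - b * t1[0], p3[1] - b * t1[1]],
        p3,
    ]
}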

ctrlcctrlv commented 2 years ago

I think it's also worth pondering why we're even using the offset_lms_sampling (lms = least-mean-square).

The other flo_curves function, offset, uses a scaling algorithm. It cautions that it might produce bad results if the initial and final offsets are very different, but that's only relevant to VWS (variable-width stroking). We may want to consider separating CWS (constant-width stroking) from VWS more deliberately, perhaps even adding a flag for which flo_curves algorithm is used, at the MFEKstroke level; MFEKstroke is MFEK/math.rlib's main consumer for the moment, and although MFEKglif is a graphical consumer, it lags behind and doesn't support everything MFEKstroke does.
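(For reference, one classic scaling-style offset is the Tiller-Hanson construction: push each leg of the control polygon outward and intersect adjacent legs. The sketch below shows the general technique only; it is not necessarily what flo_curves' offset does internally.)

// Tiller-Hanson-style offset of one cubic: translate each control-polygon
// edge by d along its left normal, then intersect adjacent offset edges to
// recover the inner control points. (Degenerate edges are not handled.)
fn offset_edge(a: [f64; 2], b: [f64; 2], d: f64) -> ([f64; 2], [f64; 2]) {
    let (dx, dy) = (b[0] - a[0], b[1] - a[1]);
    let len = dx.hypot(dy);
    let n = [-dy / len, dx / len]; // left unit normal of the edge
    ([a[0] + d * n[0], a[1] + d * n[1]], [b[0] + d * n[0], b[1] + d * n[1]])
}

// Intersection of the infinite lines through (p, q) and (r, s); the lines
// are assumed not to be parallel.
fn intersect(p: [f64; 2], q: [f64; 2], r: [f64; 2], s: [f64; 2]) -> [f64; 2] {
    let (d1, d2) = ([q[0] - p[0], q[1] - p[1]], [s[0] - r[0], s[1] - r[1]]);
    let denom = d1[0] * d2[1] - d1[1] * d2[0];
    let t = ((r[0] - p[0]) * d2[1] - (r[1] - p[1]) * d2[0]) / denom;
    [p[0] + t * d1[0], p[1] + t * d1[1]]
}

fn offset_cubic(p: [[f64; 2]; 4], d: f64) -> [[f64; 2]; 4] {
    let (e0a, e0b) = offset_edge(p[0], p[1], d);
    let (e1a, e1b) = offset_edge(p[1], p[2], d);
    let (e2a, e2b) = offset_edge(p[2], p[3], d);
    [e0a, intersect(e0a, e0b, e1a, e1b), intersect(e1a, e1b, e2a, e2b), e2b]
}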

Logicalshift commented 2 years ago

I think you're right about the fix but wrong about the problem: the problem is probably that fit_curve estimates the tangents at the start and end of the set of points (and once every 100 points), which will produce a subtly incorrect answer: using the tangents from the original curve will produce a much better answer.

The extra points being generated seem to be a bug either in the curve fitting or the error estimating algorithm. After fitting the curve, flo_curves subdivides at the point that has the highest error, but with something simple like a circular arc the error should be low enough after fitting that no subdivisions are necessary. Here it's subdividing towards the end, which I think is compensating for some sort of issue with the algorithm (not sure if it's the actual fitting algorithm or the error algorithm at this point).

The scaling offset algorithm was problematic in the past but I rewrote it alongside the LMS algorithm with a much more reliable approach so it should also be a good option. It does tend to subdivide too many times at the moment so that might be another downside to consider (the LMS algorithm is better at this even with the bug I described above).

skef commented 2 years ago

With a fixed-sized nib the tangent at any offset point should be the same as the tangent on the "source" spline, which is 90 degrees from the normal already calculated to do the offset in the first place. Estimation (beyond the accuracy of floating-point calculations) shouldn't be necessary.

I think the same is true for variable-size nibs -- the curvature at the offset point changes but the instantaneous tangent angle is the same -- because otherwise the offset point calculation would be incoherent. However, I've only thought about variable nibs a little bit so I could be mistaken.
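In code terms, the offset point and its tangent direction both fall out of quantities the offsetter already computes. A minimal sketch for the fixed-distance case (hypothetical names):

// For a fixed offset distance d, the offset point is p + d*n, and the
// direction of the offset curve there is the source tangent itself, so
// no estimation is needed.
fn offset_point(p: [f64; 2], tangent: [f64; 2], d: f64) -> ([f64; 2], [f64; 2]) {
    let len = tangent[0].hypot(tangent[1]);
    let t = [tangent[0] / len, tangent[1] / len]; // unit source tangent
    let n = [-t[1], t[0]];                        // unit normal: t rotated 90 degrees
    ([p[0] + d * n[0], p[1] + d * n[1]], t)
}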

ctrlcctrlv commented 2 years ago

(I've thought about it more, and I actually think that variable-size nibs are not really a useful feature; what is actually useful is nibs that change their angle as they traverse the path. Right now a nib always has a set angle as it marches along the path; one could imagine instead the angle following the path's normal at each point, or being user-defined at each point.)

skef commented 2 years ago

Variable-sized nibs are just a generalization of variable-width stroking, so they're useful for "Power Stroke"-like support.

Variable-angle nibs are possible, but they open a huge can of worms. All sorts of things become much more difficult (the most obvious being inflection-point madness).

skef commented 2 years ago

Non-smooth variable angle nibs (such as rectangular nibs) are (probably) harder than smooth variable-angle nibs. When using a nib with a point it's very important to know when a given point starts "drawing the offset" on one side. That depends on the angles incident to the point and the angle of the source curve. If the former are fixed you can just attend to the angle of the latter. If they're changing you have simultaneous equations to solve for every trace.
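To illustrate the dependency (a rough sketch under simplifying assumptions: a convex polygonal nib, angles in radians, my own convention rather than anything implemented here): a vertex traces the outline only while the source tangent direction lies inside the sector spanned by its two incident edge directions.

// Hypothetical predicate: does this nib vertex currently "draw the offset"?
fn vertex_is_drawing(tangent_angle: f64, edge_in_angle: f64, edge_out_angle: f64) -> bool {
    let tau = std::f64::consts::TAU;
    // Measure the sector from the incoming edge direction, counter-clockwise.
    let span = (edge_out_angle - edge_in_angle).rem_euclid(tau);
    let rel = (tangent_angle - edge_in_angle).rem_euclid(tau);
    rel <= span
}

With a fixed nib the edge angles are constants and this is a simple comparison per curve; with a rotating nib every term varies along the path, which is where the simultaneous equations come from.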

ctrlcctrlv commented 2 years ago

I think we could survive with ovals in the beginning.

skef commented 2 years ago

Well, this is close to a request for full MetaFont support in a vector environment, so the history of why MetaPost doesn't support it, or just the fact that it doesn't support it after all these years, is instructive.

Variable-size nibs have a potential problem that you don't often see in practice and can potentially ignore, which is that they can "ink" areas with edge-portions other than those tangent to the direction of stroking. Think of a line being inked by a circle growing faster than its rate of travel -- some areas "behind" the nib will be inked by the back edge. One can "address" this problem by changing the metaphor: say by having the nib be a thin line dragged along the source contour and staying normal to it, but whether that "works" depends on the effect one wants.
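That growing-circle example can be made precise. For an arc-length parameterized path, the envelope of a circle of radius r(s) has its contact points at the angle theta (measured from the travel direction) with cos(theta) = -dr/ds; a sketch of the resulting condition, under those assumptions:

// The envelope contact angle exists only while |dr/ds| <= 1; beyond that
// the nib grows faster than it travels, and areas "behind" it get inked
// by the trailing edge rather than by tangent contact.
fn envelope_contact_angle(dr_ds: f64) -> Option<f64> {
    let c = -dr_ds; // cos(theta) of the contact point
    if c.abs() <= 1.0 { Some(c.acos()) } else { None }
}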

From my (admittedly limited) research into the rotation problem, rotating nibs suffer from this much more fundamentally. There are common cases in which part of what one is looking for in a rotating nib involves inking areas with non-tangent points on the nib edge. And once you need that you need an entirely different approach.

If I were tasked with implementing this I would start by despairing. Then I would probably pursue a hybrid vector and raster-trace system where certain points were noted as "sharp" with known angles to improve the tracing process. Then I would hope I could identify all, or enough, of those points.

As it happens that's a vaguely accurate description of the old FontForge algorithm. Maybe GW went down that path in the hope of eventually supporting things like rotating nibs, perhaps limited to the polygonal case. Unfortunately it didn't manage to pick out all the needed points even in the non-rotating case, so thinking it could be nudged to the general solution seems wishful.

Logicalshift commented 2 years ago

I've just released v0.5.2 of flo_curves, which has an improved version of the fitting algorithm. I've also changed the offset algorithm to use the tangents from the original curve as suggested above, and improved the documentation for fit_curve_cubic. The reason for the extra points in the example above was that the end tangent was pointing in the wrong direction, so the fitter was trying to find a curve that reversed direction.

Finally, I modified the fitting algorithm to reparameterize at least once before splitting the curve: this should reduce the number of curves generated and subdivide them at better locations.
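For readers following along: in Schneider-style fitting, reparameterization is a Newton-Raphson update of each sample's parameter before re-fitting. A self-contained sketch of one such step (not the flo_curves implementation):

// One Newton-Raphson step toward a root of f(t) = (B(t) - s) . B'(t),
// i.e. moving sample s's parameter toward the foot of its perpendicular.
fn eval(p: [[f64; 2]; 4], t: f64) -> [f64; 2] {
    let u = 1.0 - t;
    let (b0, b1, b2, b3) = (u * u * u, 3.0 * u * u * t, 3.0 * u * t * t, t * t * t);
    [
        b0 * p[0][0] + b1 * p[1][0] + b2 * p[2][0] + b3 * p[3][0],
        b0 * p[0][1] + b1 * p[1][1] + b2 * p[2][1] + b3 * p[3][1],
    ]
}

fn deriv(p: [[f64; 2]; 4], t: f64) -> [f64; 2] {
    let u = 1.0 - t;
    [
        3.0 * u * u * (p[1][0] - p[0][0]) + 6.0 * u * t * (p[2][0] - p[1][0]) + 3.0 * t * t * (p[3][0] - p[2][0]),
        3.0 * u * u * (p[1][1] - p[0][1]) + 6.0 * u * t * (p[2][1] - p[1][1]) + 3.0 * t * t * (p[3][1] - p[2][1]),
    ]
}

fn deriv2(p: [[f64; 2]; 4], t: f64) -> [f64; 2] {
    let u = 1.0 - t;
    [
        6.0 * u * (p[2][0] - 2.0 * p[1][0] + p[0][0]) + 6.0 * t * (p[3][0] - 2.0 * p[2][0] + p[1][0]),
        6.0 * u * (p[2][1] - 2.0 * p[1][1] + p[0][1]) + 6.0 * t * (p[3][1] - 2.0 * p[2][1] + p[1][1]),
    ]
}

fn reparameterize_step(p: [[f64; 2]; 4], s: [f64; 2], t: f64) -> f64 {
    let (q, q1, q2) = (eval(p, t), deriv(p, t), deriv2(p, t));
    let d = [q[0] - s[0], q[1] - s[1]];
    let numer = d[0] * q1[0] + d[1] * q1[1];
    let denom = q1[0] * q1[0] + q1[1] * q1[1] + d[0] * q2[0] + d[1] * q2[1];
    if denom.abs() < 1e-12 { t } else { (t - numer / denom).clamp(0.0, 1.0) }
}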

ctrlcctrlv commented 2 years ago

Much obliged @Logicalshift. I will push your release into MFEK's branch (I don't know why Matt made an MFEK branch, so I'll first try not using it).

We've had some contact over the years, and FYI, I noticed over on your bugtracker that Matt left a few solved bugs open. You're quite likely to want to close those; Matt retired from the MFEK project to pursue video game development. You're back to dealing with me, I'm afraid. ;-)

ctrlcctrlv commented 2 years ago

Update: I have compared MFEK/flo_curves:master to Logicalshift/flo_curves:master and determined that two small patches could eliminate the need for the former, which I have opened as Logicalshift/flo_curves#10 and Logicalshift/flo_curves#11. Assuming these are both merged, MFEK/flo_curves can be done away with, and upstream flo_curves depended upon once again by us.

ctrlcctrlv commented 2 years ago

@simoncozens Here's the output of MFEKstroke on the circle given the latest Logicalshift flo_curves pushes when compiled against 9619a79:

image

simoncozens commented 2 years ago

Much nicer!