benjamminf / warpjs

Warp, distort, bend, twist and smudge your SVGs directly in the browser
https://benjamminf.github.io/warpjs
MIT License

Extrapolation algorithm produces "jagged" curves #1

Closed benjamminf closed 7 years ago

benjamminf commented 8 years ago

Currently the algorithm only averages control points across two pieces. This results in the unfortunate case where curves become "flatter" than they should be, and have the potential to lead to some jagged lines with certain distortion algorithms.

This isn't exactly a problem when dealing with the visual positions of the points. But it is a problem when it comes to figuring out rest positions.

There are two ways this might be fixed.

  1. Improve the approximation. Use a smarter algorithm instead of linear average of two control points.
  2. Recalculate rest positions based off the original segment. The original segment exists in some form even after interpolating (thanks to the FixedPoint class), so it can be re-interpolated. The tricky part is figuring out the "t" values: the original segment needs to be cut in two spots - the start and the end of the joined segment - so the problem becomes working out the "t" value for a set of coordinates that fall along the bezier curve. (Note that (2) only applies to rest positions, not distorted positions - this is okay, as it's the resting positions that are causing the issue.)
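The "tricky part" in idea (2) - finding the "t" value for coordinates known to lie on the curve - could be approximated numerically. A rough sketch for a cubic bezier (function names are mine, not warp.js internals):

```javascript
// Evaluate a cubic bezier [p0, p1, p2, p3] at parameter t
function cubicAt(p0, p1, p2, p3, t) {
  const u = 1 - t;
  return [0, 1].map(i =>
    u * u * u * p0[i] + 3 * u * u * t * p1[i] +
    3 * u * t * t * p2[i] + t * t * t * p3[i]
  );
}

// Estimate t for a point known to lie on the curve by dense sampling.
// Crude but illustrates the idea; a real implementation would refine the
// result with bisection or Newton's method.
function estimateT(p0, p1, p2, p3, point, steps = 1000) {
  let bestT = 0;
  let bestDist = Infinity;
  for (let i = 0; i <= steps; i++) {
    const t = i / steps;
    const [x, y] = cubicAt(p0, p1, p2, p3, t);
    const dist = (x - point[0]) ** 2 + (y - point[1]) ** 2;
    if (dist < bestDist) {
      bestDist = dist;
      bestT = t;
    }
  }
  return bestT;
}
```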
benjamminf commented 8 years ago

Regarding idea (2), I overlooked something obvious - the "t" value is already known. Interpolation is done with the fixed value 0.5. Reversing the equation is now trivial.

benjamminf commented 8 years ago

(image: quad-halving)

P1 = 2(P01) - P0

benjamminf commented 8 years ago

(image: halving)

P1 = 2(P01) - P0
P2 = 2(P23) - P3
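Both relations just reverse the midpoint step of a de Casteljau split at t = 0.5. A minimal sketch (illustrative names, not warp.js code):

```javascript
// Splitting at t = 0.5 produces midpoints like P01 = (P0 + P1) / 2;
// joining reverses them: P1 = 2(P01) - P0, and likewise P2 = 2(P23) - P3.
function midpoint(a, b) {
  return [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2];
}

function extrapolateControlPoint(p, pMid) {
  return [2 * pMid[0] - p[0], 2 * pMid[1] - p[1]];
}

const P0 = [0, 0];
const P1 = [1, 3];
const P01 = midpoint(P0, P1);                  // [0.5, 1.5]
console.log(extrapolateControlPoint(P0, P01)); // [1, 3] - P1 recovered exactly
```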

benjamminf commented 8 years ago

Turns out I'm partially wrong. The implementation didn't work 100%, and that's because the assumption that t = 0.5 is actually incorrect.

Joining two pieces that have just been split does indeed have a t value of 0.5. However, after these are joined back together, the result might then be joined with another piece that was split from somewhere else, which would end up with a t value of 0.33.

It may be possible to track this t value by saving what division of the overall segment the piece comes from. These division values will be compared when joining pieces: if both pieces report the same division value (for example 1/4 and 1/4), their t value will be 0.5. If they differ, the t value will be different (for example 1/4 and 1/2 would have a t value of 0.33).

If we take D0 to be the division value of the first piece N0, and D1 to be the division value of the second piece N1, then t = D0 / (D0 + D1)
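As a quick sketch of that formula (names are mine, not warp.js internals):

```javascript
// t value at the join of two pieces with division values d0 and d1
function joinParameter(d0, d1) {
  return d0 / (d0 + d1);
}

console.log(joinParameter(1 / 4, 1 / 4)); // 0.5 - equal divisions
console.log(joinParameter(1 / 4, 1 / 2)); // ≈ 0.333 - the unequal case above
```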

benjamminf commented 8 years ago

In order to implement the above division tracking, fractional values need to be stored and summed. Due to the potential magnitude of the fractions (considering we're interpolating at a pretty high resolution) floating point inaccuracies will likely come into play.

If instead the fractions can be stored as their integer numerator/denominator pairs, then this will eliminate inaccuracies.
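One way to do that - a sketch only, using BigInt numerator/denominator pairs rather than anything from warp.js:

```javascript
// Exact fraction arithmetic with BigInt pairs [numerator, denominator].
// BigInt avoids overflow as denominators grow with repeated halving.
function gcd(a, b) {
  return b === 0n ? a : gcd(b, a % b);
}

function addFractions([n0, d0], [n1, d1]) {
  const n = n0 * d1 + n1 * d0;
  const d = d0 * d1;
  const g = gcd(n < 0n ? -n : n, d);
  return [n / g, d / g];
}

// t = D0 / (D0 + D1), computed exactly as a reduced fraction
function joinParameterExact(div0, div1) {
  const [sn, sd] = addFractions(div0, div1);
  const n = div0[0] * sd;
  const d = div0[1] * sn;
  const g = gcd(n, d);
  return [n / g, d / g];
}

console.log(joinParameterExact([1n, 4n], [1n, 2n])); // [1n, 3n], i.e. t = 1/3
```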

benjamminf commented 8 years ago

After working on this all day, I've concluded that it's just not possible to reliably implement this. I got it working reliably with quadratic curves, but due to floating point inaccuracies it still produced these flattened edges. Cubic beziers were another beast altogether that I just couldn't get playing nicely with the algorithm.

I don't think this is possible to implement (a shame, as it's pretty important), so I'm hoping some other insight will eventually solve this problem.

benjamminf commented 8 years ago

Another idea may be to restrict joining/extrapolating to pairs of pieces that were once part of an actual piece, rather than joining any pieces possible. Using this restriction, pieces can be saved prior to splitting/interpolating and stored on a stack. This way we're not approximating points when joining.
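A minimal sketch of that idea (all names are illustrative): each half keeps a reference to the piece it was split from, and joining a matched pair restores the saved parent instead of extrapolating.

```javascript
// Split a piece, recording the parent on both halves.
// splitFn is whatever curve-splitting routine is in use.
function splitPiece(piece, splitFn) {
  const [leftPoints, rightPoints] = splitFn(piece.points);
  return [
    { points: leftPoints, parent: piece },
    { points: rightPoints, parent: piece },
  ];
}

// Join only pairs that came from the same parent - exact, no approximation.
function joinPieces(left, right) {
  if (left.parent !== null && left.parent === right.parent) {
    return left.parent;
  }
  return null; // unmatched pieces would need the extrapolation path
}

// Example with a trivial line-segment halving routine:
const halve = ([a, b]) => {
  const mid = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2];
  return [[a, mid], [mid, b]];
};
const original = { points: [[0, 0], [4, 4]], parent: null };
const [l, r] = splitPiece(original, halve);
console.log(joinPieces(l, r) === original); // true
```

Note the cost: every half keeps its parent alive, so the full split history is retained in memory.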

One major downside is that memory consumption would grow at an exponential rate. It may not be significant enough to worry about, but we won't know until it's tried.