Alternatively, a lerp-based algorithm like de Casteljau or Seiler interpolation would let us store half the data we currently store, at the cost of slower evaluation.
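For illustration, here is what the lerp-based idea could look like. This is a minimal sketch in plain Lua, not the library's actual code, and it assumes the segment is already in Bézier form (a Catmull-Rom segment would first need its control points converted):

```lua
-- Hedged sketch (not the library's code): de Casteljau evaluation of a
-- cubic Bezier from its four control points via repeated lerps, so only
-- the points need storing, not expanded polynomial coefficients.
local function lerp(a, b, t)
    return a + (b - a) * t -- works for numbers and any vector type with +, -, *
end

local function deCasteljau(p0, p1, p2, p3, t)
    local a = lerp(p0, p1, t)
    local b = lerp(p1, p2, t)
    local c = lerp(p2, p3, t)
    local d = lerp(a, b, t)
    local e = lerp(b, c, t)
    return lerp(d, e, t) -- point on the cubic at t: six lerps per evaluation
end
```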
I wrote some horribly unreadable algorithms to reduce duplicate computations. I also included some special cases (sketched after the table):
| Parameter | Value | Parametrization name |
| --- | --- | --- |
| alpha | 0 | uniform |
| alpha | 0.5 | centripetal |
| alpha | 1 | chordal |
| tension | 1 | taut |
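To make the alpha rows concrete, a hedged sketch of how such special cases could be dispatched, assuming the standard Catmull-Rom knot increment |p2 - p1|^alpha; the function name and table-vector layout here are hypothetical:

```lua
-- Hedged sketch (hypothetical names): dispatch on alpha when computing the
-- Catmull-Rom knot increment |p2 - p1|^alpha from a squared distance.
local function knotIncrement(p1, p2, alpha)
    local dx, dy, dz = p2.x - p1.x, p2.y - p1.y, p2.z - p1.z
    local distSquared = dx * dx + dy * dy + dz * dz
    if alpha == 0 then
        return 1 -- uniform: the magnitude is never needed
    elseif alpha == 0.5 then
        return distSquared ^ 0.25 -- centripetal: |d|^0.5 == (|d|^2)^0.25
    elseif alpha == 1 then
        return math.sqrt(distSquared) -- chordal: plain distance
    else
        return distSquared ^ (alpha * 0.5) -- general case
    end
end
```

The taut row presumably short-circuits differently, since tension affects the tangents rather than the knot spacing, which would explain its much larger speedup.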
The following benchmarks show that the new methods are about 13-17% faster, with taut being ~65% faster. We also see that the perf improvements in chordal and centripetal decrease with the number of interpolants, so their value is questionable given the extra code paths that they add. As a result, I will nix chordal, but I will keep centripetal since it is the default.
- 10 interpolants per spline, 100 instantiations per benchmark
- 100 interpolants per spline, 10 instantiations per benchmark
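For reference, a harness with this shape could look like the sketch below; `Spline.new`'s exact signature is an assumption, so the usage is left as a comment:

```lua
-- Hedged sketch of the benchmark shape: time N instantiations of a spline.
local function benchmark(instantiations, makeSpline)
    local start = os.clock()
    for _ = 1, instantiations do
        makeSpline()
    end
    return os.clock() - start
end

-- e.g. 100 instantiations (the "interpolants per spline" knob would be
-- whatever point set is passed to the constructor):
-- local elapsed = benchmark(100, function()
--     return Spline.new(points)
-- end)
```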
Completely merged in 4a501aba8d4f54a3f5aa6f1c118ff05dcf2dabf7.
When parametrizing in `Spline.new()`, many of the vector subtractions and vector magnitudes can be reused across instantiations. This should reduce parametrization time significantly.
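A minimal sketch of that reuse, with hypothetical names and table vectors: compute each adjacent difference and magnitude once, then let every spline that straddles a pair of points read from the shared arrays.

```lua
-- Hedged sketch (hypothetical names): compute each adjacent difference and
-- magnitude once so that every Spline sharing those points can reuse them
-- instead of recomputing inside Spline.new.
local function precomputeDeltas(points)
    local diffs, mags = {}, {}
    for i = 1, #points - 1 do
        local p, q = points[i], points[i + 1]
        local d = { x = q.x - p.x, y = q.y - p.y, z = q.z - p.z }
        diffs[i] = d
        mags[i] = math.sqrt(d.x * d.x + d.y * d.y + d.z * d.z)
    end
    return diffs, mags
end
-- A spline over points[i-1 .. i+2] would then read diffs[i-1 .. i+1] and
-- mags[i-1 .. i+1] rather than recomputing them per instantiation.
```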