Every time the frequency changes, the remaining part of the last frequency segment, lost_Δt ≤ frequency_duration % (1/sample_frequency), is dropped. In the example graph, this is the interval between the last black sample dot of the previous frequency (t < 2) and the frequency change point (t = 2).
Code Time Function: Brown
Real Mathematical Function: Blue
I think the next sample should actually be taken at the red star.
The lost part is very small, so the effect is usually hard to notice.
Cause
The time loss happens in this code in Mode.java; in particular, the forced (int) cast causes the loss.
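The truncation can be sketched as follows. This is a minimal, hypothetical illustration of how a forced (int) cast drops the fractional remainder at a frequency change, not the actual Mode.java code; the sample rate and segment duration are assumed values chosen for the example:

```java
// Hypothetical sketch, not the actual Mode.java implementation.
public class TruncationSketch {
    public static void main(String[] args) {
        double sampleRate = 8000.0; // samples per second (assumed value)
        double duration = 0.0123;   // seconds for one frequency segment (assumed value)

        // Current behavior: the forced (int) cast drops the fractional sample
        // count, so lost_dt = duration % (1 / sampleRate) is silently discarded.
        int samples = (int) (duration * sampleRate);     // 98.4 samples -> 98
        double lostDt = duration - samples / sampleRate; // time lost at this change

        // Possible fix: carry the fractional remainder into the next segment so
        // that no time is dropped across frequency changes.
        double carry = 0.0;
        double exact = duration * sampleRate + carry;
        int emitted = (int) exact;
        carry = exact - emitted; // remainder reused when the next tone starts

        System.out.println(samples + " samples, lost " + lostDt
                + " s, carry " + carry + " samples");
    }
}
```

With the carry approach, the fractional sample left over at each frequency change accumulates instead of being discarded, so the code time function stays aligned with the real one over many changes.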
Thank you for pointing it out.
We already suspected this during implementation, but we went ahead because it did not cause a major problem.
It would be nice if you could improve it and create a pull request.