mixxxdj / mixxx

Mixxx is Free DJ software that gives you everything you need to perform live mixes.
http://mixxx.org

increase BPM tap filter length #10010

Open mixxxbot opened 2 years ago

mixxxbot commented 2 years ago

Reported by: ronso0 Date: 2020-06-09T13:05:30Z Status: Confirmed Importance: Low Launchpad Issue: lp1882776 Tags: bpm, easy, usability Attachments: [Screenshot 2020-09-07 at 11.13.56.png](https://bugs.launchpad.net/bugs/1882776/+attachment/5408282/+files/Screenshot 2020-09-07 at 11.13.56.png)


Currently the filter sample list is limited to 5. If we increase that to, let's say, 16 or more, the detected BPM would get much more accurate, and hitting cur_pos would then create a somewhat usable beat grid pretty quickly.

Code is here: https://github.com/mixxxdj/mixxx/blob/master/src/engine/controls/bpmcontrol.cpp#L33
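
For context, the limit in question is just a small compile-time constant, along these lines (the identifier is illustrative, not the exact name used in bpmcontrol.cpp):

```cpp
// Illustrative sketch only -- the real constant in
// src/engine/controls/bpmcontrol.cpp has a different name and context.
constexpr int kTapFilterLength = 5; // current size of the tap sample list
// Proposed: raise it so the running estimate averages over more taps, e.g.
// constexpr int kTapFilterLength = 16;
```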

mixxxbot commented 2 years ago

Commented by: ronso0 Date: 2020-06-24T20:32:54Z


It seems a bit more complicated than I first thought. The accuracy doesn't improve with a longer filter list, which may be due to the nature of the InterQuartileMean filter.

I think for long filter lists like 8+ we get better results if we take the time between the first and last tap and divide it by (number of taps) - 1, instead of dropping values considered invalid like the IQM does (assuming the user taps every beat).
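
A minimal sketch of that calculation (standalone, not Mixxx code; the function name and signature are made up for illustration), assuming the user really taps on every beat: spread the first-to-last time evenly over (number of taps) - 1 intervals and convert to BPM.

```cpp
#include <vector>

// Estimate BPM from the first and last tap only: (number of taps - 1)
// intervals spread over the time between the first and the last tap.
double bpmFromFirstAndLastTap(const std::vector<double>& tapTimesSecs) {
    if (tapTimesSecs.size() < 2) {
        return 0.0; // not enough taps for an estimate
    }
    const double totalSecs = tapTimesSecs.back() - tapTimesSecs.front();
    if (totalSecs <= 0.0) {
        return 0.0; // guard against identical or out-of-order timestamps
    }
    const double numIntervals = static_cast<double>(tapTimesSecs.size() - 1);
    return 60.0 * numIntervals / totalSecs; // intervals per minute = BPM
}
```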

So to improve tap results we'd need

What do you think?

mixxxbot commented 2 years ago

Commented by: ronso0 Date: 2020-07-13T13:02:43Z


Any comments on the proposed change?

mixxxbot commented 2 years ago

Commented by: ferranpujolcamins Date: 2020-08-20T21:06:27Z


The interquartile mean effectively takes into account only half of the data points you give it. From this point of view, and assuming a data set with no outliers, the interquartile mean needs twice as many data points as the regular mean to get the same accuracy. But with enough data points, it is just as accurate.

The regular mean is very sensitive to outliers. In this case, if you miss a single tap, or even just have bad timing on one or two taps, you almost surely get the BPM wrong. This is the problem the interquartile mean solves.

If increasing its length alone doesn't improve the behaviour of the filter, I'd try to drop fewer data points, i.e. generalize the current interquartile mean to a truncated mean where we can choose a discard threshold other than the quartiles.
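
A rough sketch of that generalization (standalone, not the Mixxx filter; the function name and parameter are made up): sort the samples, drop a configurable fraction from each end, and average the rest. A discardFraction of 0.25 reproduces a simple form of the interquartile mean, while smaller values keep more of the data.

```cpp
#include <algorithm>
#include <cstddef>
#include <numeric>
#include <vector>

// Truncated mean: drop `discardFraction` of the smallest and of the largest
// samples (0.25 ~ interquartile mean, 0.1 drops only the most extreme values).
double truncatedMean(std::vector<double> samples, double discardFraction) {
    if (samples.empty()) {
        return 0.0;
    }
    std::sort(samples.begin(), samples.end());
    std::size_t drop =
            static_cast<std::size_t>(samples.size() * discardFraction);
    if (2 * drop >= samples.size()) {
        drop = (samples.size() - 1) / 2; // always keep at least one sample
    }
    const auto first = samples.begin() + drop;
    const auto last = samples.end() - drop;
    return std::accumulate(first, last, 0.0) /
            static_cast<double>(last - first);
}
```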

mixxxbot commented 2 years ago

Commented by: ronso0 Date: 2020-08-21T15:56:57Z


Hmm, 'regular filter' is probably not the correct term. Maybe my proposal wasn't clear enough, sorry. I mean BPM = (taps - 1) / (tap duration), and that's just a simple division; I don't know if it can be called a filter.

Tapping along a 120 BPM track for ~30 s gives ~60 beats. We take the total time and the total number of taps. Outliers don't matter because it's not each interval that counts; only the first and last tap need to be somewhat on-beat. The longer I tap, the closer the result gets to 120 BPM. I never managed to get close to the actual BPM with the IQM filter; it floats around 120 with the same (im)precision no matter how long I tap.
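
To make the arithmetic explicit: ~60 intervals in ~30 s gives 60 / 30 s × 60 s/min = 120 BPM, and being 0.1 s off on the first or last tap only moves the denominator from 30 s to 30.1 s, i.e. an error of roughly 0.4 BPM.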

mixxxbot commented 2 years ago

Commented by: ferranpujolcamins Date: 2020-09-07T09:06:34Z


Your proposed method is equivalent to the mean of the time intervals between taps.
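
To spell that equivalence out (writing $t_1, \dots, t_n$ for the tap timestamps), the sum of the intervals telescopes, so their mean only depends on the first and last tap:

$$\frac{1}{n-1}\sum_{i=1}^{n-1}\left(t_{i+1}-t_i\right)=\frac{t_n-t_1}{n-1}$$

and 60 s divided by that mean interval is exactly the BPM = (taps - 1) / (tap duration) formula from the previous comment.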

However, your comment made me realize something. If you model the taps as normally distributed time intervals (the intervals between adjacent taps), then whenever you tap one beat too late, all subsequent beats will also be too late, because the error in that interval is carried into every later tap.

I think the right model in this case is to think of each tap as a normally distributed error with respect to the ideal beat position. This means that each tap's accuracy is independent of the other taps. If an outlier happens and, for example, you make one tap too late, the next tap will still be roughly on time and the interval between them will be shorter, compensating for the longer previous interval. I think this models reality better.

From this point of view, the IQM might not be the right tool here. I'm not sure about the regular mean either, because it is over-sensitive to the accuracy of the first and last taps: it doesn't matter how well you tap the beats in between, if you are not accurate on either the first or the last, the BPM will be off.

Maybe we can draw some code from the constant bpm analyzer to solve this.

mixxxbot commented 2 years ago

Commented by: ferranpujolcamins Date: 2020-09-07T09:17:37Z Attachments: [Screenshot 2020-09-07 at 11.13.56.png](https://bugs.launchpad.net/mixxx/+bug/1882776/+attachment/5408282/+files/Screenshot 2020-09-07 at 11.13.56.png)


This picture illustrates the two models described above.

The top figure represents the model where the time intervals are normally distributed. You can see that when an outlier happens, all following beats are off, because in this model the intervals are independently distributed, so the error is not compensated.

In the second figure, we see that when an outlier happens, the next interval is shorter, because here what is independently distributed is not the interval, but the deviation of each beat from the ideal.

It is clear that in the second model (the right one) there's no point in dropping outliers, because they are compensated.
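
To see the compensation numerically, here is a standalone sketch (not Mixxx code) of the second model with a single, deterministic outlier instead of random noise: one late tap stretches one interval and shrinks the next by the same amount, so the first-to-last estimate is unchanged.

```cpp
// Standalone sketch: under the second model, a late tap lengthens one interval
// and shortens the next by the same amount, so the sum of intervals -- and
// therefore (last - first) / (n - 1) -- is untouched.
#include <cstdio>
#include <vector>

int main() {
    const double periodSecs = 0.5; // ideal beat period for 120 BPM
    const int numTaps = 9;

    std::vector<double> taps;
    for (int i = 0; i < numTaps; ++i) {
        taps.push_back(i * periodSecs); // ideal tap positions
    }
    taps[4] += 0.15; // one outlier: the fifth tap is 150 ms late

    for (int i = 1; i < numTaps; ++i) {
        std::printf("interval %d: %.3f s\n", i, taps[i] - taps[i - 1]);
    }
    // Intervals 4 and 5 come out as 0.65 s and 0.35 s, yet the estimate
    // is still exactly 120 BPM:
    const double bpm = 60.0 * (numTaps - 1) / (taps.back() - taps.front());
    std::printf("first-to-last estimate: %.1f BPM\n", bpm);
    return 0;
}
```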