Closed: rconstanzo closed this issue 9 months ago
Actually, reading through all the metrics, they all read like placeholder text. It's a lot of abbreviations and jargon, and I honestly don't understand what any of the metrics do or what the text is trying to communicate.
A half-sentence on each, saying what the metric is potentially useful for, would go a long way.
An example:
> 1 = '1'
> HFC thresholds on (sum of (squared magnitudes * binNum) / nBins)
I only know this stands for High Frequency Content because the attrui in Max says as much. Beyond that, I don't know how important or relevant it is to explain what/how it's computed versus what it does. Something like "this is useful for measuring differences in high frequency content, in a way that differs from metric ABC for reasons XYZ" would be much more useful.
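For what it's worth, here's my rough reading of that quoted description as a sketch (Python/numpy; the function name, the 0-based bin numbering, and the per-frame framing are my guesses, not anything taken from the FluCoMa docs):

```python
import numpy as np

def hfc(mags: np.ndarray) -> float:
    """Rough reading of the reference text quoted above:
    sum of (squared magnitudes * binNum), divided by nBins."""
    bin_nums = np.arange(len(mags))  # binNum per bin, assumed to start at 0
    return float(np.sum((mags ** 2) * bin_nums) / len(mags))

# e.g. on the magnitude spectrum of one analysis frame:
# value = hfc(np.abs(np.fft.rfft(frame)))
```

Even if that reading is right, it still doesn't tell me *why* I'd pick this metric over another, which is the part the reference should spell out.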
Similarly, metrics 6-9 (nice) are not only vague but also really confusing to read through: if I want to know what metric 9 is doing, I have to read backwards through all the previous metrics like I'm working through Infinite Jest or something.
I would offer some more useful text/copy as suggestions to PR with, but I genuinely don't know what the metrics do or what they are useful for.
nice catch. @MattS6464 is on a roll with this, so he might find inspiration for it before the end of his internship; otherwise I'll do it when I have a minute.
thanks!
God, I hate these new text-entry form fields... not everything requires the same kind of "bug reporting form" layout (such as this).
But basically the reference files in Max (haven't checked other platforms, hence posting it in this repo) have this listed for metric 4:
I imagine at this point it will no longer be removed(?), so it's best to remove that text.
FluCoMa Version: 1.0.6