Open brishtibheja opened 1 month ago
Which version did you test it on?
Anki, version 24.06.03.
Do you still encounter this problem in the latest beta?
I cannot check that. Can you try with my deck?
I can't reproduce the big drop, but I can reproduce something similar (365 days):

- w[6]=0.9999, MRR=0.87
- w[6]=0.9998, MRR=0.89
- w[6]=0.9990, MRR=0.87
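To narrow down exactly where the jump sits, a parameter sweep like the one below could help. This is only a sketch: `compute_mrr` is a stand-in for whatever routine produces the MRR value, not a real API.

```python
# Hypothetical sketch: sweep w[6] over a narrow range and record the
# resulting MRR to locate the jump. `compute_mrr` is a placeholder
# callable, not a real Anki/FSRS API.
def sweep_w6(compute_mrr, weights, lo=0.8879, hi=0.8890, step=0.0001):
    """Return (w6, mrr) pairs, plus adjacent pairs where MRR jumps."""
    results = []
    w6 = lo
    while w6 <= hi + 1e-9:
        ws = list(weights)
        ws[6] = round(w6, 4)  # round to cancel float accumulation error
        results.append((ws[6], compute_mrr(ws)))
        w6 += step
    # Flag any adjacent pair whose MRR differs by more than 5 points.
    jumps = [(a, b) for a, b in zip(results, results[1:])
             if abs(a[1] - b[1]) > 0.05]
    return results, jumps
```

Running this over .8879 to .8890 in steps of .0001 would pinpoint the exact boundary reported below.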
I tried that range for another one of my decks. In this deck, try changing w[6] from .8890 to .8880. The big drop happens at .8884.
Nope, not on the latest beta. It's just consistently 0.89.
https://github.com/user-attachments/assets/3b55ba61-6e9a-4a5d-a5fa-9fcc05184958
This concerns how the MRR value changes as w[6] is changed. I think this is weird behaviour that demands an explanation.
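For what it's worth, one mechanism that produces exactly this kind of step change (purely a hypothesis, not confirmed anywhere in this thread) is an argmin over a discrete grid of candidate retention values: a tiny shift in a weight can flip which candidate wins, so the output jumps by a whole grid step even while smooth metrics like log loss barely move. A toy illustration, with a made-up workload curve:

```python
# Toy illustration (not Anki's actual code): picking the minimum of a
# workload curve over a discrete grid of retention candidates. A tiny
# change in w6 shifts the curve slightly, flipping which candidate is
# the minimum, so the returned value jumps by a whole grid step.
def best_retention(w6, candidates=None):
    if candidates is None:
        candidates = [r / 100 for r in range(70, 100)]  # 0.70 .. 0.99
    def workload(r):
        # Made-up smooth curve whose minimum moves continuously with w6.
        return (r - w6) ** 2 + 0.001 * r
    return min(candidates, key=workload)
```

Here the underlying curve changes continuously with w6, yet the returned retention is piecewise constant with sudden 0.01 jumps, which would also explain why log loss and RMSE stay unchanged while the reported value leaps.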
In one of my presets, when w[6] is .8885, the MRR value is shown as .87. If you decrease that to .8884, the MRR value drops to .78, a drop of 9 percentage points. This looked a bit weird, as I was playing around with w[6] and it barely did anything to CMRR; I had been slowly decreasing w[6] from .9000, so I'd have expected to see gradual change. From .8884, decreasing w[6] further does nothing to MRR until it reaches .8879, where MRR moves back to .87. Note that the RMSE/log-loss values stay exactly the same as I decrease w[6] from .8885 (log loss = .3869, RMSE = 6.52%). I have experimented with different values, but haven't found any pattern here.

Here is the deck: export.zip (rename export.zip → export.apkg)

Also reproduced this with another one of my decks, although the change in MRR is less drastic (it drops from .83 to .76 as w[6] is increased from .9997 to .9998). The log-loss/RMSE values stay the same.

On a tangent: since CMRR outputs a workload:knowledge value, can we test this (or have we already?) on the 20k dataset? You could run CMRR using half of the reviews and check how accurate the simulation is for the rest of the collection.
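The split-half check suggested above could be sketched roughly like this. All the function names here (`run_cmrr`, `simulate`) are placeholders for the real CMRR and simulator routines, not actual APIs:

```python
# Rough sketch of the proposed split-half validation. `run_cmrr` and
# `simulate` are placeholder callables, not real APIs.
def split_half_check(reviews, run_cmrr, simulate):
    """Fit CMRR on the first half of a collection's reviews, then
    compare the simulated outcome against the observed second half."""
    mid = len(reviews) // 2
    first, second = reviews[:mid], reviews[mid:]
    retention = run_cmrr(first)  # recommendation from half the data
    predicted = simulate(first, retention, horizon=len(second))
    # Observed pass rate over the held-out second half of the reviews.
    observed = sum(r["passed"] for r in second) / len(second)
    return retention, predicted, observed
```

Running this across the 20k-collection dataset would give a distribution of predicted-vs-observed gaps, which is one way to quantify how trustworthy the simulation behind CMRR is.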