This PR updates and adds plots for the imbalanced-data intro slides, adds a benchmark to those slides, and adds the benchmark R file.
Why are there differences between the PPV calculated by the mlr3 measures and the PPV from the confusion matrix? The TPR is essentially the same, but the PPV is not. Here is the output in R:
Example: take nr 12:
The PPV from aggregate is 0.4959300. Why is there a difference? TPR and accuracy match the confusion matrix, and we have set the positive class to 1, as noted in the code review as well.
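A likely explanation (my assumption, not confirmed from the output above): mlr3's aggregate step macro-averages the per-fold PPV values, while a pooled confusion matrix yields a micro-averaged PPV. Because PPV is a ratio of counts, the two only coincide when the per-fold counts are balanced; TPR can match almost exactly while PPV drifts apart. A minimal sketch with made-up per-fold counts:

```python
# Hypothetical (TP, FP) counts per fold of a 3-fold CV run -- illustrative
# numbers only, not taken from the benchmark output.
folds = [(10, 5), (2, 8), (9, 1)]

# Macro average: compute PPV within each fold, then average the fold scores.
# This is what averaging a per-iteration measure across resampling folds does.
macro_ppv = sum(tp / (tp + fp) for tp, fp in folds) / len(folds)

# Micro average: pool all predictions into one confusion matrix, then
# compute a single PPV from the summed counts.
tp_total = sum(tp for tp, _ in folds)
fp_total = sum(fp for _, fp in folds)
micro_ppv = tp_total / (tp_total + fp_total)

print(f"macro PPV: {macro_ppv:.4f}")  # per-fold average
print(f"micro PPV: {micro_ppv:.4f}")  # pooled confusion matrix
```

With these counts the macro PPV is about 0.589 while the pooled PPV is 0.600, so a mismatch of this kind between `aggregate` and the confusion matrix is expected rather than a bug.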
I have not changed any of the calculations, except switching the cross-validation to 3-fold to avoid NaNs in the plots. Is that okay?