Closed FrieseWoudloper closed 4 years ago
Hi @FrieseWoudloper, thanks for all the issues! I changed fairness_check_metrics() to explicitly state the metrics c('ACC', 'TPR', 'PPV', 'FPR', 'STP'); I think it will be helpful. I know these metric name abbreviations are sometimes difficult to decipher, so now whenever a user has to state a metric name, the documentation points to the fairness_check function documentation, where every abbreviation is explained. I also made changes according to your other issues (#26, #27, #28). Thanks for using fairmodels.
It was difficult for me to figure out the valid values for the fairness_metrics argument of the function metric_scores.
The Usage section of the help says:

However, fairness_check_metrics is a helper function that is not exposed, so I don't think it should be mentioned.

Also, the Arguments section says:
However, the plot doesn't give the exact valid argument values.
From the Examples section:
The meaning of TPR and ACC I could easily guess, but it wasn't immediately clear to me what STP stands for. So maybe it would be better to explicitly state all valid values, including a full description?
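For example, the Arguments entry could spell the values out along these lines. This is only a sketch of what such a description might look like; the abbreviation expansions below are the standard fairness-metric names, and the exact wording used in the fairmodels documentation may differ:

```r
# Hypothetical documentation sketch: the valid values for the
# fairness_metrics argument of metric_scores, with full descriptions.
# (Expansions are standard fairness-metric names, not quoted from the docs.)
fairness_metric_descriptions <- c(
  ACC = "Accuracy",
  TPR = "True Positive Rate",
  PPV = "Positive Predictive Value",
  FPR = "False Positive Rate",
  STP = "Statistical Parity"
)

# The valid argument values are then simply:
names(fairness_metric_descriptions)
```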