ModelOriented / fairmodels

Flexible tool for bias detection, visualization, and mitigation
https://fairmodels.drwhy.ai/
GNU General Public License v3.0

Difficulty figuring out valid values for fairness_metrics argument in metric_scores function #29

Closed by FrieseWoudloper 4 years ago

FrieseWoudloper commented 4 years ago

It was difficult for me to figure out the valid values for the fairness_metrics argument of the metric_scores function. The Usage section of the help says:

metric_scores(x, fairness_metrics = fairness_check_metrics())

However, fairness_check_metrics is a helper function that is not exported, so I don't think it should be mentioned here.

Also the Arguments section says:

fairness_metrics character, vector with fairness metric names. Default metrics are ones in fairness_check plot

However, the plot doesn't show the exact strings that are valid argument values.

From the Examples section:

ms <- metric_scores(fobject, fairness_metrics = c("TPR","STP","ACC"))

I could easily guess the meaning of TPR and ACC, but it wasn't immediately clear to me what STP stands for.

So maybe it would be better to state all valid values explicitly, each with a full description?

jakwisn commented 4 years ago

Hi @FrieseWoudloper, thanks for all the issues! I changed fairness_check_metrics() to an explicit list of the metrics, c('ACC', 'TPR', 'PPV', 'FPR', 'STP'); I think that will be helpful. I know these metric-name abbreviations are sometimes difficult to decipher, so now, wherever a user has to supply a metric name, the documentation points to the fairness_check function documentation, where every abbreviation is explained. I also made changes for your other issues (#26, #27, #28). Thanks for using fairmodels!
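For readers landing on this issue: a minimal sketch of how the five metric names are used, loosely following the fairmodels README example (the glm model, the bundled german dataset, and the Risk/Sex columns are assumptions from that example, not from this thread). The comments spell out what each abbreviation stands for.

```r
library(fairmodels)
library(DALEX)

# German credit data shipped with fairmodels
data("german")
y_numeric <- as.numeric(german$Risk) - 1

# A simple logistic regression, wrapped in a DALEX explainer
model <- glm(Risk ~ ., data = german, family = binomial())
explainer <- explain(model, data = german[, -1], y = y_numeric)

# Fairness object, with Sex as the protected attribute
fobject <- fairness_check(explainer,
                          protected  = german$Sex,
                          privileged = "male")

# Valid fairness_metrics values and what the abbreviations stand for:
#   ACC - accuracy
#   TPR - true positive rate
#   PPV - positive predictive value
#   FPR - false positive rate
#   STP - statistical parity
ms <- metric_scores(fobject,
                    fairness_metrics = c("ACC", "TPR", "PPV", "FPR", "STP"))
plot(ms)
```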