tidymodels / probably

Tools for post-processing class probability estimates
https://probably.tidymodels.org/

Specify metrics in threshold_perf() #37

Closed szego closed 1 year ago

szego commented 3 years ago

Description

This PR adds a new metrics argument to threshold_perf() that accepts a yardstick metric set, so that the user can specify which metrics are computed.

By default this argument is NULL, in which case the function computes the default metrics. If not NULL, the function checks that it's an appropriate metric set (class metrics only) and computes only the metrics provided in the set.
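A minimal sketch of how the new argument could be used. It assumes the segment_logistic data set shipped with probably (with its Class outcome and .pred_good probability column); the particular metrics and threshold grid are illustrative, not from the PR itself.

```r
library(probably)
library(yardstick)

# segment_logistic ships with probably: a two-class outcome (Class)
# and a predicted probability column (.pred_good).
data("segment_logistic")

# A metric set containing only class metrics; probability metrics
# would be rejected by the validation described in this PR.
cls_metrics <- metric_set(sens, spec, j_index)

threshold_perf(
  segment_logistic,
  truth = Class,
  estimate = .pred_good,
  thresholds = seq(0.2, 0.8, by = 0.1),
  metrics = cls_metrics  # new argument; NULL falls back to the defaults
)
```

With metrics = NULL the call behaves exactly as before, so existing code is unaffected.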

Motivation and Context

Fixes #25

How Has This Been Tested?

Tested using yardstick 0.0.7 on R 4.0.2.

Checklist:

Note that I did not add an example in the docs of threshold_perf() that uses the new feature.

szego commented 3 years ago

Forgot to note that the code that verifies the passed metric set (lines 106 to 117 in threshold_perf.R) incorporates some code from tune::check_metrics(). See lines 305 to 310 here: https://github.com/tidymodels/tune/blob/a97576f3f3e36f362388de7e2f3ef2df8ab3a38f/R/checks.R#L305
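The validation described above could look roughly like the following base-R sketch: accept NULL, insist on a yardstick metric set, and reject anything that is not a class metric. The helper name check_metrics_arg() is hypothetical, and the use of the metric set's "metrics" attribute follows the pattern in tune::check_metrics(), not necessarily the exact code in this PR.

```r
# Hypothetical helper sketching the metric-set validation in this PR.
check_metrics_arg <- function(metrics) {
  if (is.null(metrics)) {
    # NULL means: compute threshold_perf()'s default metrics.
    return(invisible(NULL))
  }
  if (!inherits(metrics, "metric_set")) {
    stop("`metrics` should be created by yardstick::metric_set().",
         call. = FALSE)
  }
  # A metric set carries its component metric functions in a "metrics"
  # attribute; each function's class records its type (e.g. "class_metric"
  # vs. "prob_metric").
  is_class <- vapply(attributes(metrics)$metrics,
                     inherits, logical(1), what = "class_metric")
  if (!all(is_class)) {
    stop("All metrics must be class metrics (not probability metrics).",
         call. = FALSE)
  }
  invisible(metrics)
}
```

Centralizing the check like this lets threshold_perf() fail early with a clear message instead of erroring deep inside the metric computation.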

topepo commented 1 year ago

Sorry for the long delay on this.

Since we were working on similar problems related to calibration, I made a separate branch and PR to solve this (but credited you in the news file).

Thanks for getting this going!

github-actions[bot] commented 1 year ago

This pull request has been automatically locked. If you believe you have found a related problem, please file a new issue (with a reprex: https://reprex.tidyverse.org) and link to this issue.