Open PGijsbers opened 5 years ago
Resolved in 0de5e1b4b71e8ac5ef9c079a7e0bb3bbefd17aac. Custom metrics are only accepted by creating your own `Metric`, and I will probably keep it that way. Before closing this issue it needs to be decided whether this is the way custom metrics will be supported. If so, an example needs to be added to the documentation before this issue gets closed.
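For illustration, a custom metric along these lines could look as follows. This is a minimal sketch, not GAMA's actual API: the `Metric` class here, with its `name`, `fn`, and `maximize` fields, is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass(frozen=True)
class Metric:
    """Hypothetical wrapper pairing a score function with its direction."""
    name: str
    fn: Callable[[Sequence[int], Sequence[int]], float]
    maximize: bool = True  # whether higher scores are better

def zero_one_accuracy(y_true, y_pred):
    """Fraction of exactly matching labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# A user-defined metric: wrap the function once, pass it to the optimizer.
accuracy = Metric(name="accuracy", fn=zero_one_accuracy)
print(accuracy.fn([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.75
```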
The `score` function needs to be adapted to return multiple scores when more than one optimization metric was given, or it needs to clearly state that it will only return one.
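The first option could behave roughly like this sketch (illustrative only, not the actual implementation): one metric yields a single float, several metrics yield a tuple.

```python
def score(metrics, y_true, y_pred):
    """Evaluate each metric; return a float for one metric,
    a tuple of floats for several (hypothetical behaviour)."""
    scores = tuple(m(y_true, y_pred) for m in metrics)
    return scores[0] if len(scores) == 1 else scores

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def error_rate(y_true, y_pred):
    return 1.0 - accuracy(y_true, y_pred)

print(score([accuracy], [0, 1], [0, 1]))              # 1.0
print(score([accuracy, error_rate], [0, 1], [0, 0]))  # (0.5, 0.5)
```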
With mixed metrics, currently only `predict_proba` is called, and the class prediction is taken to be the class with the highest probability. Instead, `predict` should also be called, and those predictions should be used for class-label metrics.
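The proposed fix amounts to dispatching per metric. A sketch, with a hypothetical `requires_probabilities` flag on each metric and a toy stand-in model:

```python
from types import SimpleNamespace

class StubModel:
    """Toy stand-in for a fitted two-class pipeline with fixed outputs."""
    def predict(self, X):
        return [0 for _ in X]
    def predict_proba(self, X):
        return [[0.75, 0.25] for _ in X]

def evaluate(model, metrics, X, y_true):
    """Call predict_proba only for probability-based metrics and
    predict for class-label metrics (the proposed fix, sketched)."""
    results = {}
    for metric in metrics:
        if metric.requires_probabilities:
            y_input = model.predict_proba(X)
        else:
            y_input = model.predict(X)
        results[metric.name] = metric.fn(y_true, y_input)
    return results

accuracy = SimpleNamespace(
    name="accuracy",
    fn=lambda y, preds: sum(t == p for t, p in zip(y, preds)) / len(y),
    requires_probabilities=False,
)
mean_true_class_proba = SimpleNamespace(
    name="mean_true_class_proba",
    fn=lambda y, probs: sum(row[t] for t, row in zip(y, probs)) / len(y),
    requires_probabilities=True,
)

X = [[0], [1], [2], [3]]
y = [0, 0, 1, 1]
print(evaluate(StubModel(), [accuracy, mean_true_class_proba], X, y))
# {'accuracy': 0.5, 'mean_true_class_proba': 0.5}
```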
Currently the way to specify optimization metrics is a little messy. There is the scikit-learn-like `scoring` hyperparameter, which specifies multi-objective optimization towards the given metric and pipeline length. There is the `objectives` hyperparameter, which lets you specify more than one optimization metric, and there is `optimize_strategy`, which lets you specify minimization and/or maximization. This should be unified into one clear way to set the metric(s) to optimize towards.
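A unified interface might accept a single argument that takes a metric name, a `(metric, direction)` pair, or a mixed list of either. Purely illustrative, not a committed design:

```python
def normalize_objectives(objectives):
    """Accept 'accuracy', ('log_loss', 'min'), or a mixed list of either,
    and normalize everything to (name, direction) pairs (sketch)."""
    if isinstance(objectives, (str, tuple)):
        objectives = [objectives]
    normalized = []
    for obj in objectives:
        if isinstance(obj, str):
            normalized.append((obj, "max"))  # default direction: maximize
        else:
            name, direction = obj
            if direction not in ("max", "min"):
                raise ValueError(f"Unknown direction: {direction!r}")
            normalized.append((name, direction))
    return normalized

print(normalize_objectives("accuracy"))
# [('accuracy', 'max')]
print(normalize_objectives([("log_loss", "min"), "accuracy"]))
# [('log_loss', 'min'), ('accuracy', 'max')]
```

Collapsing `scoring`, `objectives`, and `optimize_strategy` into one such argument would make the direction explicit per metric instead of spreading it across three hyperparameters.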