When grid/random search returns the best-performing model and it has been evaluated on the test set, a user often wants to retrain it on the whole dataset with the same hyperparameters. Currently, one has to inspect which hyperparameters were selected, and this is problematic in some cases, especially in a Pipeline, where the types of the original transformers and the predictor are lost. For that reason, we want to expose an .unfit() method, which would create an unfitted estimator with the same hyperparameters.
Example usage:
val split = splitData(x, y)                           // hold out a test set
val selectedModel = gridSearch(split.xTr, split.yTr)  // best model found by the search
val score = f1Score(split.yTe, selectedModel.predict(split.xTe))
val finalModel = selectedModel.unfit().fit(x, y)      // refit on the full dataset with the same hyperparameters
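A minimal sketch of how .unfit() could be wired, assuming hypothetical Estimator/Model interfaces and an illustrative RidgeEstimator; none of these names are the library's actual API:

// Hypothetical types standing in for the library's data structures.
typealias Matrix = List<DoubleArray>
typealias Vector = DoubleArray

// An unfitted estimator: hyperparameters only, no learned state.
interface Estimator {
    fun fit(x: Matrix, y: Vector): Model
}

// A fitted model that remembers how to reconstruct its estimator.
interface Model {
    fun predict(x: Matrix): Vector
    // Proposed method: return an unfitted estimator carrying the same
    // hyperparameters, so it can be refit on the full dataset.
    fun unfit(): Estimator
}

// Illustrative predictor: the fitted model keeps its hyperparameters
// and hands them back to a fresh estimator in unfit().
class RidgeEstimator(private val alpha: Double) : Estimator {
    override fun fit(x: Matrix, y: Vector): Model {
        val weights = DoubleArray(x.first().size) // actual training omitted
        return RidgeModel(alpha, weights)
    }
}

class RidgeModel(private val alpha: Double, private val weights: Vector) : Model {
    override fun predict(x: Matrix): Vector =
        DoubleArray(x.size) { i -> x[i].indices.sumOf { j -> x[i][j] * weights[j] } }

    override fun unfit(): Estimator = RidgeEstimator(alpha)
}

Under this sketch, a Pipeline's unfit() would presumably delegate to each fitted step's own unfit(), which sidesteps the lost-type problem: each fitted step knows which estimator produced it, even when the Pipeline's static types no longer do.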