Closed browshanravan closed 4 years ago
Unfortunately, it would be non-trivial to implement the desired behavior in the current Pipelinehelper.

I suggest that you add the parameter `var_smoothing` to the grid search parameters and use the `cv_results_` field of the grid object for detailed numbers. The `validation_curve` function is more like a scaled-down version of grid search, and I'm not sure why you want to perform it separately afterwards.

Otherwise, to use `validation_curve` the way you describe, you must know which model is set in the `best_estimator_` field (otherwise, you can't know which parameter to test). This, however, means that you no longer need the Pipelinehelper at all.
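To make the suggestion concrete, here is a minimal sketch of that approach using plain scikit-learn (no Pipelinehelper wrapper); the dataset, pipeline step name `clf`, and parameter range are illustrative, not taken from the original code:

```python
# Sketch: tune var_smoothing inside the grid search itself and read the
# per-setting scores from cv_results_, instead of running validation_curve
# as a separate step afterwards.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)

pipe = Pipeline([("clf", GaussianNB())])
param_grid = {"clf__var_smoothing": np.logspace(-9, -1, 5)}

grid = GridSearchCV(pipe, param_grid, cv=5)
grid.fit(X, y)

# cv_results_ already contains the mean validation score for every
# candidate value, i.e. exactly the numbers a validation curve would plot.
for vs, score in zip(grid.cv_results_["param_clf__var_smoothing"],
                     grid.cv_results_["mean_test_score"]):
    print(vs, score)
```

The loop at the end shows why a separate `validation_curve` call is redundant here: each `(parameter value, mean_test_score)` pair is one point on the curve.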
That's fair enough. I was thinking more along the lines of plotting how a change in a particular parameter influences the validation curve, given that all other parameters are optimised, but it is not critical to my pipeline. Many thanks.
I think that the key point is that all other parameters might no longer be optimal once you change a different parameter. Otherwise, one wouldn't need a grid search but optimize all parameters one after another.
Agreed! Great point :) 👍
Apologies, but I wasn't sure if this qualified as a new issue or not.
So, when I use my example code and put the output of `grid.best_estimator_.get_params()` into the `validation_curve()` function and set the `param_name`, I get one of the following errors, whatever combination I try.

Code

Error output for `param_name = 'clf__selected_model__var_smoothing'`:

Error output for `param_name = 'selected_model__var_smoothing'`:
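For reference, `validation_curve` resolves `param_name` against the estimator you pass in, so once the best model is known it is usually easiest to rebuild it without the Pipelinehelper wrapper and drop the `selected_model` level from the name. A minimal sketch, assuming the selected model turned out to be a `GaussianNB` in a pipeline step named `clf` (both names are hypothetical):

```python
# Sketch: run validation_curve against a rebuilt pipeline, so the
# parameter path 'clf__var_smoothing' matches the estimator directly
# (no Pipelinehelper 'selected_model' indirection).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import validation_curve
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import Pipeline

X, y = load_iris(return_X_y=True)

# Hypothetical: the model the grid search selected, rebuilt directly.
pipe = Pipeline([("clf", GaussianNB())])

train_scores, test_scores = validation_curve(
    pipe, X, y,
    param_name="clf__var_smoothing",  # resolved against `pipe` itself
    param_range=np.logspace(-9, -1, 5),
    cv=5,
)
print(test_scores.mean(axis=1))
```

The key point is that the dotted name must be a valid key of `pipe.get_params()`; any extra level that only exists inside the Pipelinehelper wrapper (such as `selected_model`) makes the lookup fail.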