Closed — jpandeinge closed this issue 3 years ago
I solved it: I had defined the parameters in the `params` variable incorrectly. I changed `params` from

```python
params = {'estimator__linearregression__fit_intercept': ['True', 'False'],
          'estimator__linearregression__normalize': ['True', 'False']}
```

to the following, by removing the `estimator__` prefix in front of every base model used:

```python
params = {'linearregression__fit_intercept': ['True', 'False'],
          'linearregression__normalize': ['True', 'False']}
```

(Note that the values should also be real booleans, `[True, False]`, rather than the strings `'True'` and `'False'` — a string `'False'` is truthy and will not disable the option.)
My original question was whether `StackingRegressor` supports scikit-learn's `GridSearchCV()`, where I use an algorithm for hyperparameter tuning and optimization. Sample code:

I believe the error comes from the `params` variable that I defined; I just can't seem to figure it out, since I tried to supply all the parameters for all the regressors that I got using `grid.get_params().keys()`. However, the above code leads to the error below:

Is there a way to tune all the parameters of all the regressors used, in order to find the optimal ones and try them on a new model? And is there a way to retain them, since I would like to do a hyperparameter search?