SheffieldML / GPyOpt

Gaussian Process Optimization using GPy
BSD 3-Clause "New" or "Revised" License

Cannot use RF model type #188

Open vmarkovtsev opened 6 years ago

vmarkovtsev commented 6 years ago

The following code crashes for me:

opt = GPyOpt.methods.BayesianOptimization(f=fit_rules,
                                          model_type="RF",
                                          domain=domain, acquisition_type="LCB",
                                          acquisition_weight=0.2, num_cores=4)
opt.run_optimization(max_iter=50)
opt.plot_convergence()

The traceback:

/usr/local/lib/python3.7/dist-packages/GPyOpt/core/bo.py in run_optimization(self, max_iter, max_time, eps, context, verbosity, save_models_parameters, report_file, evaluations_file, models_file)
    135             # --- Update model
    136             try:
--> 137                 self._update_model(self.normalization_type)
    138             except np.linalg.linalg.LinAlgError:
    139                 break

/usr/local/lib/python3.7/dist-packages/GPyOpt/core/bo.py in _update_model(self, normalization_type)
    254 
    255         # Save parameters of the model
--> 256         self._save_model_parameter_values()
    257 
    258     def _save_model_parameter_values(self):

/usr/local/lib/python3.7/dist-packages/GPyOpt/core/bo.py in _save_model_parameter_values(self)
    258     def _save_model_parameter_values(self):
    259         if self.model_parameters_iterations is None:
--> 260             self.model_parameters_iterations = self.model.get_model_parameters()
    261         else:
    262             self.model_parameters_iterations = np.vstack((self.model_parameters_iterations,self.model.get_model_parameters()))

AttributeError: 'RFModel' object has no attribute 'get_model_parameters'

The default model type works fine.
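A possible user-side workaround (a sketch, not an official fix) is to attach a stub `get_model_parameters` to the model class before calling `run_optimization`, so `_save_model_parameter_values` has something to `np.vstack`. The class and method names below are taken from the traceback; `RFModelStub` is a stand-in so the pattern runs without GPyOpt installed — against the real library the target would be GPyOpt's `RFModel`. Note also that the `run_optimization` signature in the traceback lists a `save_models_parameters` argument; depending on the installed version, passing `save_models_parameters=False` may avoid the failing call entirely.

```python
import numpy as np

class RFModelStub:
    """Stand-in for GPyOpt's RFModel, which lacks get_model_parameters."""
    pass

def get_model_parameters(self):
    # Return a 2-D placeholder row so the np.vstack call in
    # _save_model_parameter_values keeps working across iterations.
    return np.zeros((1, 1))

# Monkey-patch the method onto the class. For GPyOpt the target would be
# the RFModel class itself, applied before constructing BayesianOptimization.
RFModelStub.get_model_parameters = get_model_parameters

model = RFModelStub()
params = model.get_model_parameters()
# Mimics what _save_model_parameter_values does on later iterations.
stacked = np.vstack((params, model.get_model_parameters()))
print(stacked.shape)  # (2, 1)
```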

apaleyes commented 6 years ago

Yeah, our random forest implementation is a bit behind. PRs are welcome!
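For anyone considering that PR, the minimal missing piece is the method named in the `AttributeError`. The sketch below shows one way an implementation could look; the choice of what to report (the forest's hyperparameters as a row vector) and the companion `get_model_parameters_names` method are assumptions, not confirmed GPyOpt API, and the class here is a self-contained stand-in rather than the real `RFModel`:

```python
import numpy as np

class RFModelSketch:
    """Sketch of the interface GPyOpt's BO loop appears to expect.
    Method names follow the AttributeError in the report above."""

    def __init__(self, n_estimators=100, max_depth=10):
        # Hypothetical hyperparameters of the underlying random forest.
        self.n_estimators = n_estimators
        self.max_depth = max_depth

    def get_model_parameters(self):
        # The BO loop stacks successive calls with np.vstack,
        # so each call must return a 2-D row.
        return np.atleast_2d([self.n_estimators, self.max_depth])

    def get_model_parameters_names(self):
        # Assumed companion method for labeling the saved columns.
        return ['n_estimators', 'max_depth']

model = RFModelSketch()
print(model.get_model_parameters().shape)  # (1, 2)
```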

Jingfei-Liu commented 5 years ago

I also hit this error, "'RFModel' object has no attribute 'get_model_parameters'". What can I do to implement this model? Can you fix it and publish a new release?

Jingfei-Liu commented 5 years ago

> (quoting the original report and traceback above)
Have you solved this problem? Can you share your experience? Thanks!

bywords commented 5 years ago

I came across the same error. Is this still unresolved?

apaleyes commented 5 years ago

No, it isn't fixed, and I don't think we'll have time to work on it any time soon. But, as mentioned above, if someone puts in the effort to open a PR, we'd be happy to merge it.