Closed: poroc300 closed this issue 3 years ago
Many thanks for your helpful feedback!
It is true that when loading a model from file, the parameters are not loaded back into Python. They are loaded into the corresponding C++ object, but they are not visible from Python. This behavior is inherited from LightGBM; in general, I try not to deviate too much from the LightGBM implementation concerning the tree-boosting part. Given that LightGBM does not support this and it is not a very important feature, I do not plan to add it. Note that the parameters are saved in the corresponding model file, but for easier use it is probably better to save them separately.
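A minimal sketch of saving the tuned parameters separately, as suggested above. The parameter names and file name below are illustrative assumptions, not taken from the original issue:

```python
import json

# Hypothetical tuned parameters (illustrative only)
best_params = {"learning_rate": 0.05, "max_depth": 3, "num_leaves": 15}

# Save the parameters next to the model file
with open("model_params.json", "w") as f:
    json.dump(best_params, f)

# Later, load them back independently of the model itself
with open("model_params.json") as f:
    loaded_params = json.load(f)

print(loaded_params)
```

This keeps the parameters accessible even though the loaded model object does not expose them in Python.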
Another thing: your code is currently ignoring the `gp_model`, and I guess that this is not your intention. This is due to the fact that the line

```python
gpb_model = gpb.GPModel(group_data=group).set_optim_params(params={"optimizer_cov": "gradient_descent"})
```

returns `None` (since `set_optim_params` returns nothing). I will change this such that in future releases of GPBoost, this works correctly. For now, you need to use two lines of code:

```python
gpb_model = gpb.GPModel(group_data=group)
gpb_model.set_optim_params(params={"optimizer_cov": "gradient_descent"})
```
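The pitfall above is general Python behavior, not specific to GPBoost: chaining an assignment onto a method that returns nothing loses the object. A minimal sketch with a toy class (not the actual GPBoost API):

```python
# Toy class mimicking a setter that mutates in place and returns None
class Model:
    def __init__(self):
        self.params = {}

    def set_optim_params(self, params):
        # Updates internal state; implicitly returns None
        self.params.update(params)

# Chained form: the Model instance is lost, only None is kept
chained = Model().set_optim_params({"optimizer_cov": "gradient_descent"})
print(chained)  # None

# Correct pattern: keep the reference, then call the setter separately
model = Model()
model.set_optim_params({"optimizer_cov": "gradient_descent"})
print(model.params)  # {'optimizer_cov': 'gradient_descent'}
```

Making `set_optim_params` `return self` would be the usual way to support chaining, which is presumably what the planned fix refers to.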
Many thanks for the thorough response and the advice on setting up the optimal parameters.
First of all thank you for all the nice improvements added in the last package update.
I performed a grid-search optimization to determine optimal parameters for my analysis. After finding them, I trained a model with those parameters and saved it to a JSON file. When I load the model, however, I cannot retrieve the parameters used to train it. I am not sure whether I am using the wrong attributes, and I know this is only a minor issue. I could save the list of parameters in a separate file, but I thought it would be handier to access them through the loaded model itself.
Please find below a snippet of code to replicate my problem. I am using Spyder 5.0.0 with Python 3.8 on Windows 10. Thank you very much.