Closed. @everdark closed this issue 2 years ago.
@everdark Thanks for your report! I think it's very related to #2604 and #2208. Also, it'll require something like `LGBM_BoosterGetConfig` or adding an `[out] out_config` argument to the existing functions on the C++ side. @guolinke
Closed in favor of being in #2302. We decided to keep all feature requests in one place.
Welcome to contribute this feature! Please re-open this issue (or post a comment if you are not a topic starter) if you are actively working on implementing this feature.
is there any update on this issue?
> is there any update on this issue?

@zyxue, thanks for your interest in LightGBM!
If you're interested in working on this feature and contributing, let us know and we'd be happy to answer questions you have.
Otherwise, you can subscribe to notifications on this issue for updates.
Hey @jameslamb, I'm interested in giving it a try. Do you have guidance on where to start?
Thanks @zyxue !
I'd start by reading the issues @StrikerRUS mentioned at https://github.com/microsoft/LightGBM/issues/2613#issuecomment-562216072, just to get a better understanding of this part of the code base
Next, I'd add a test to https://github.com/microsoft/LightGBM/blob/da98f24711a2faab17f94e5b2a636e6609c93fa6/tests/python_package_test/test_basic.py using the reproducible example provided by @everdark. That test should fail until your changes are made.
Next, try to work through changes on the C++ side based on @StrikerRUS's statement in https://github.com/microsoft/LightGBM/issues/2613#issuecomment-562216072:

> it'll require something like `LGBM_BoosterGetConfig` or adding `[out] out_config` argument to the existing functions at cpp side
Here's the relevant Python code that's called to create a Booster from a model `.txt` file. Note that it calls `LGBM_BoosterCreateFromModelfile()`.
I believe you'll need to create a proposal for extracting the `Config_` property from the Booster after it's loaded.
"Config" is the word we use in LightGBM's C++ code to refer to an object that holds all parameters (see e.g. https://github.com/microsoft/LightGBM/pull/4724#pullrequestreview-790133134).
Here's the code called by `LGBM_BoosterCreateFromModelfile()` which gets parameters from the model text file.
I'll re-open this issue for now since you're planning to work on it. We have a policy in this repo of keeping feature request issues marked "closed" if no one is working on them, so if for any reason you decide not to work on this feature for now, please let me know so we can re-close it.
And if you are interested in contributing but feel that this feature is not right for you, now that you know more about it, let me know what you're looking to work on and I'd be happy to suggest another one. Thanks again for your help!
Thank you @jameslamb for the informative guide! I'll try to get to it.
`loaded_parameter_` isn't accessible via the `Boosting` class in the C++ code, right? It looks like `loaded_parameter_` is an attribute specific to `GBDT` only?
Hey @jameslamb, do you have any feedback on my PR above, please? I wonder if that's the right direction for loading back the saved params?
Thanks for starting on the work, @zyxue! We will get to reviewing it as soon as possible.
A few other maintainers and I work on LightGBM in our spare time, so we can sometimes be slow to respond (especially for larger features like this one, which require more effort to review). Thanks for your patience.
This issue has been automatically locked since there has not been any recent activity since it was closed. To start a new related discussion, open a new issue at https://github.com/microsoft/LightGBM/issues including a reference to this.
This was locked accidentally. I just unlocked it. We'd still welcome contributions related to this feature!
Environment info
Operating System: Windows 10 (Same result on both Windows and WSL)
CPU/GPU model: Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz
C++/Python/R version: Python 3.7
LightGBM version or commit hash: 2.3.1 installed by pip
Error message
Reproducible examples
The coding example above is directly borrowed from the official example `advanced_example.py`.
I've confirmed the parameters have been written to the model file. Here is the trailing of the file:

Is this behavior by design? I found this because I'm using `shap` with a saved model, and it failed to compute SHAP values because `shap` needs to access `objective` in the params, which is gone if the Booster is a pre-trained and re-loaded one.

As of now, my workaround is to also pass the params to `Booster` when loading:
However, I don't think this is good practice, since there is no way to make sure the passed params are consistent with the saved model.