psaks opened this issue 4 years ago
It doesn't seem that the feature importances change between refits. Using `lightgbm==2.3.0` I get the following:

```python
import lightgbm as lgb
from sklearn.datasets import make_classification

xval, yval = make_classification(n_samples=1000, n_features=10)
model = lgb.LGBMClassifier(n_estimators=100, learning_rate=0.05, verbose=-1)

for i in range(10):
    model.fit(xval, yval)
    print(model.feature_importances_)
```

Output:

```
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
[244 537 213 214 183 222 282 264 175 648]
```
If this is "correct" LightGBM behaviour, then there is obviously no need to average the featureimportances over multiple iterations.