fabsig / KTBoost

A Python package which implements several boosting algorithms with different combinations of base learners, optimization algorithms, and loss functions.

Is it possible to add a monotone constraint? #7

Open flippercy opened 4 years ago

flippercy commented 4 years ago

Hi Mr. Sigrist:

Thank you for implementing such a cool algorithm! I am wondering whether it is possible to add a monotone constraint to the main function. This is crucial for problems such as credit scoring, where domain knowledge is important, and it is supported by most major boosting implementations such as XGBoost and LightGBM.
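For reference, a minimal usage sketch of how both libraries expose this via their documented `monotone_constraints` parameter (the data and model settings below are made up purely for illustration):

```python
import numpy as np
import xgboost as xgb
import lightgbm as lgb

# Toy data: feature 0 should have an increasing effect on y, feature 1 a decreasing one.
rng = np.random.RandomState(0)
X = rng.uniform(size=(500, 2))
y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=500)

# XGBoost: +1 = increasing, -1 = decreasing, 0 = unconstrained
xgb_model = xgb.XGBRegressor(monotone_constraints='(1,-1)', n_estimators=100)
xgb_model.fit(X, y)

# LightGBM: same encoding, passed as a list (one entry per feature)
lgb_model = lgb.LGBMRegressor(monotone_constraints=[1, -1], n_estimators=100)
lgb_model.fit(X, y)
```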

Happy boosting!

Sincerely,

Yu Cao

fabsig commented 4 years ago

Thank you for bringing this up, Yu! This would for sure be helpful for credit scoring and also other areas.

Given that the approach used by XGBoost and LightGBM is relatively simple (as far as I remember, they simply disallow splits that are not in line with the constraints), this can certainly be implemented. Note that this is also being discussed for scikit-learn, see e.g. https://github.com/scikit-learn/scikit-learn/issues/6656, so we can likely reuse code and ideas from there.
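To illustrate the split-screening idea, here is a minimal sketch (not KTBoost's or the other libraries' actual tree code; the function names and the `(gain, threshold, left_value, right_value)` tuple representation are hypothetical):

```python
def split_is_allowed(constraint, left_value, right_value):
    """Check whether a candidate split respects a monotonicity constraint.

    constraint: +1 (increasing), -1 (decreasing), 0 (unconstrained).
    left_value / right_value: the values the two children would predict
    after splitting on the constrained feature.
    """
    if constraint == +1:   # increasing: larger feature values -> larger predictions
        return left_value <= right_value
    if constraint == -1:   # decreasing: larger feature values -> smaller predictions
        return left_value >= right_value
    return True            # unconstrained feature


def best_allowed_split(candidate_splits, constraint):
    """Pick the highest-gain split among those that respect the constraint.

    candidate_splits: iterable of (gain, threshold, left_value, right_value).
    Returns None if no candidate split is admissible.
    """
    allowed = [s for s in candidate_splits
               if split_is_allowed(constraint, s[2], s[3])]
    return max(allowed, key=lambda s: s[0]) if allowed else None
```

As far as I recall, the full implementations additionally bound the leaf values that descendants of a constrained split are allowed to take, so that the constraint keeps holding further down the tree, but the basic mechanism is the rejection step above.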

I currently don't have time to work on this, but any contributions are welcome.