F4RZ4D opened 3 years ago
👋 Thanks for opening your first issue here! If you're reporting a 🐞 bug, please make sure you include steps to reproduce it.
@F4RZ4D I've linked this feature request in the LightGBM GitHub repo to see whether it can be added upstream. Thank you for the suggestion.
@F4RZ4D The issue I created, https://github.com/microsoft/LightGBM/issues/4201, was closed with the reply: "That's related to a previous issue #3447, which is already in the feature request & voting hub #2302. Contributions are welcome."
Unfortunately this is not supported yet; it needs to be implemented in the native LightGBM code. I will leave the issue open here for now.
Is your feature request related to a problem? Please describe. It would benefit many users to be able to perform multiple quantile regression with one model instead of training a separate model for each alpha. Moreover, multiple quantile regression can be formulated so that quantile crossing is penalized, which makes all of the fitted quantiles more accurate.
Describe the solution you'd like Multiple quantile regression would allow the user to specify multiple alphas for a LightGBMRegressor and thus obtain several quantile estimates from a single model. If the fitted quantiles are constrained to be non-crossing, all of them become more accurate.
Additional context A simple loss function for non-crossing multiple quantile regression is described here: https://stats.stackexchange.com/questions/249874/the-issue-of-quantile-curves-crossing-each-other
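The kind of loss the linked answer describes can be sketched in numpy: a sum of pinball (quantile) losses over all alpha levels, plus a penalty whenever a lower quantile's prediction exceeds a higher one's. The function names and the penalty weight `lam` are illustrative assumptions, not an existing LightGBM API:

```python
# Composite multi-quantile loss with a crossing penalty (numpy sketch).
import numpy as np

def pinball_loss(y, q_pred, alpha):
    """Average pinball (quantile) loss for one quantile level alpha."""
    diff = y - q_pred
    return np.mean(np.maximum(alpha * diff, (alpha - 1.0) * diff))

def multi_quantile_loss(y, q_preds, alphas, lam=1.0):
    """Sum of pinball losses over all levels, plus a penalty on crossings.

    y:       targets, shape (n_samples,)
    q_preds: predictions, shape (n_samples, n_quantiles), one column per alpha
    alphas:  quantile levels sorted ascending
    lam:     weight of the crossing penalty (hypothetical hyperparameter)
    """
    loss = sum(pinball_loss(y, q_preds[:, j], a) for j, a in enumerate(alphas))
    # Crossing occurs when a lower quantile's column exceeds the next one's;
    # predictions should be non-decreasing across the alpha axis.
    crossing = np.maximum(q_preds[:, :-1] - q_preds[:, 1:], 0.0)
    return loss + lam * np.mean(crossing)
```

Minimizing this composite objective jointly over all quantile heads is what would make a single-model, non-crossing implementation possible inside LightGBM.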