ogrisel / pygbm

Experimental Gradient Boosting Machines in Python with numba.
MIT License
183 stars · 32 forks

Did you stop development because you cannot do better than LightGBM or XGBoost or CatBoost? #96

Closed Sandy4321 closed 4 years ago

Sandy4321 commented 4 years ago

Did you stop development because you cannot do better than LightGBM or XGBoost or CatBoost?

NicolasHug commented 4 years ago

Not really, development is stalled here because we have been focusing on the scikit-learn implementation (in Cython, not in Numba), which is about as fast as other libraries in most cases.

ogrisel commented 4 years ago

The scope of this project is not to be faster than highly optimized libraries like LightGBM and XGBoost, but to be almost as fast while remaining significantly simpler in terms of code (high-level Python accelerated by numba vs low-level C++ for the other two libraries).
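To illustrate the "high-level Python accelerated by numba" style described above, here is a hypothetical gradient-histogram kernel (not pygbm's actual code): a plain Python loop that numba JIT-compiles to machine code. The `ImportError` fallback is only so the sketch also runs without numba installed.

```python
import numpy as np

try:
    from numba import njit  # JIT-compiles the decorated loop to machine code
except ImportError:          # fallback: run as plain (slow) Python
    def njit(f):
        return f

@njit
def build_histogram(n_bins, binned_feature, gradients):
    # Accumulate the sum of gradients falling into each feature bin;
    # this is the kind of hot loop a histogram-based GBM accelerates.
    hist = np.zeros(n_bins)
    for i in range(binned_feature.shape[0]):
        hist[binned_feature[i]] += gradients[i]
    return hist

binned = np.array([0, 1, 1, 2], dtype=np.uint8)
grads = np.array([0.5, 1.0, 1.0, -0.5])
print(build_histogram(3, binned, grads))  # [ 0.5  2.  -0.5]
```

The same source runs unchanged with or without the JIT, which is the simplicity argument: the compiled path is fast, but the code stays readable Python rather than C++.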

ogrisel commented 4 years ago

The Cython implementation in scikit-learn is a bit more verbose than pygbm, but it does not require adding numba as an extra dependency to scikit-learn for the time being.

ogrisel commented 4 years ago

BTW, if you are interested in the numba version (that is, pygbm), there are a couple of improvements implemented in the Cython version in scikit-learn that could be ported here:

https://github.com/ogrisel/pygbm/labels/help%20wanted

Feel free to contribute a PR if you are interested.