stanfordmlgroup / ngboost

Natural Gradient Boosting for Probabilistic Prediction
Apache License 2.0
1.64k stars 215 forks

Sympy #219

Open guyko81 opened 3 years ago

guyko81 commented 3 years ago

Adding a new distribution, Gamma, plus two working examples using Sympy:

This way development can speed up significantly and a wider community can contribute.
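To make the proposal concrete, here is a minimal sketch of the Sympy idea: write the negative log-likelihood once, let Sympy differentiate it with respect to the internal (log-scale) parameters, and compile the result to a fast numeric function. The parametrization and names here are illustrative assumptions, not ngboost's actual API.

```python
import sympy as sp

# Observation must be positive for the Gamma likelihood
y = sp.Symbol("y", positive=True)
# Internal parameters are log(alpha), log(beta) so the gradient is unconstrained
log_alpha, log_beta = sp.symbols("log_alpha log_beta", real=True)
alpha, beta = sp.exp(log_alpha), sp.exp(log_beta)

# Negative log-likelihood of Gamma(shape=alpha, rate=beta)
nll = -(alpha * sp.log(beta) - sp.loggamma(alpha)
        + (alpha - 1) * sp.log(y) - beta * y)

# Sympy derives d_score symbolically, chain rule through exp() included
grad = [sp.diff(nll, p) for p in (log_alpha, log_beta)]

# Compile to numeric callables (scipy supplies the digamma function)
grad_fn = sp.lambdify((log_alpha, log_beta, y), grad, modules="scipy")
```

The appeal is that the developer never hand-derives the gradient, which is where most bugs in new distribution implementations come from.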

ryan-wolbeck commented 3 years ago

@guyko81 in order to get the build to function properly you'll have to modify https://github.com/stanfordmlgroup/ngboost/blob/master/pyproject.toml instead of the requirements.txt
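For reference, ngboost manages its dependencies with Poetry, so a new dependency like sympy would be declared in pyproject.toml rather than requirements.txt. A sketch (the version constraint is an assumption, not the project's actual pin):

```toml
[tool.poetry.dependencies]
sympy = "^1.8"   # hypothetical version constraint for illustration
```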

guyko81 commented 3 years ago

@ryan-wolbeck I can't figure out what's going on with the failed checks, and nothing on Google helps. The log shows:

make: *** [lint] Error 1
Makefile:12: recipe for target 'lint' failed

And why do 2 of the commits have a red X next to them? Is the test failure noted there?

What I can tell is that the code runs fine, and as far as I can see the fitted models are the same as before.

alejandroschuler commented 3 years ago

Thanks for providing this example! I don't think it quite fits the bill of what I'd like for the design, though. It seems like the developer would need to understand and use sympy in order to implement new distributions. More importantly, they would still need to understand the idea of differentiating the score of the distribution w.r.t. the parameters, and the difference between the internal and user-facing parametrizations. These are straightforward concepts for folks with a lot of experience and formal education, but I expect they're a stretch for many users. That's why the current work on the jax branch is set up so that the user doesn't even need to implement most of the methods in order to get a working prototype.
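The jax-branch idea described above can be sketched roughly as follows: the developer writes only the log-likelihood, and automatic differentiation supplies the gradient w.r.t. the internal parameters, so no calculus (symbolic or by hand) is required. This is a simplified illustration with assumed names, not the actual branch code; it uses a Normal with an internal (mu, log_sigma) parametrization.

```python
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

def nll(params, y):
    # params = (mu, log_sigma); exp() keeps sigma positive internally
    mu, log_sigma = params
    return -norm.logpdf(y, loc=mu, scale=jnp.exp(log_sigma))

# d_score comes for free via autodiff -- the developer never derives it
d_score = jax.grad(nll, argnums=0)
```

The design trade-off is speed (hand-derived or sympy-compiled gradients can be faster) against a much lower barrier to contributing a working prototype distribution.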