stanfordmlgroup / ngboost

Natural Gradient Boosting for Probabilistic Prediction
Apache License 2.0

Discrete explanatory variables #351

Closed patrickocal closed 4 months ago

patrickocal commented 4 months ago

Before I dive in and start using this codebase, I have the following question. The paper suggests that, although we have freedom over the domain of the dependent variable, the explanatory variables satisfy $x \in \mathbb{R}^n$. Am I to interpret this as meaning that $x$ is required to be a real-valued random vector? That is, are discrete explanatory variables excluded?

Just a quick yes/no would be much appreciated, as the author (of the paper) is taking a while to answer.

patrickocal commented 4 months ago

To try to answer my own question: I can't see why we need $x$ to be real-valued, since the natural gradient pertains to the dual (parameter) space. So, are we to assume that one-hot representations of discrete/categorical variables are allowed? I can see that one-hot representations are used here: https://github.com/stanfordmlgroup/ngboost/blob/38c2245c400a69b8885e08c63a38dcf4224267cc/examples/experiments/survival_exp.py#L73
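
For what it's worth, here is the pattern I have in mind (a minimal sketch of my own, with made-up toy data and column names; only `NGBRegressor`, `fit`, `predict`, and `pred_dist` are ngboost's actual API):

```python
import numpy as np
import pandas as pd
from ngboost import NGBRegressor

# Toy data (made up): one categorical feature and one real-valued feature.
df = pd.DataFrame({
    "color": ["red", "green", "blue", "red", "green", "blue"],
    "size": [1.0, 2.5, 0.7, 1.8, 2.2, 0.9],
})
y = np.array([3.1, 5.0, 1.2, 3.9, 4.7, 1.5])

# pd.get_dummies expands "color" into 0/1 indicator columns, so every
# entry of X is a real number -- the one-hot columns just live in {0, 1}.
X = pd.get_dummies(df, columns=["color"]).to_numpy(dtype=float)

ngb = NGBRegressor(n_estimators=100, verbose=False).fit(X, y)
print(ngb.predict(X))           # point predictions (distribution means)
print(ngb.pred_dist(X).params)  # parameters of the predicted distributions
```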

If so, then I suggest changing the paper as it is misleading.

patrickocal commented 4 months ago

Okay, so the minimum working example answers this: there are discrete (if not categorical) variables in X.

alejandroschuler commented 4 months ago

Yes, discrete features are supported. $x \in \mathbb{R}^p$ is not misleading because, for example, $x \in \{0,1\}^p \subset \mathbb{R}^p$.
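
For instance (a toy sketch, not from the docs), integer-coded discrete features can be passed to NGBoost as-is; the tree base learners simply split on the numeric values:

```python
import numpy as np
from ngboost import NGBRegressor

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(0, 2, size=n),   # binary feature in {0, 1}
    rng.integers(0, 5, size=n),   # ordinal feature in {0, ..., 4}
    rng.normal(size=n),           # continuous feature
]).astype(float)
# Made-up linear signal plus noise, just to have something to fit.
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + X[:, 2] + rng.normal(scale=0.1, size=n)

ngb = NGBRegressor(n_estimators=100, verbose=False).fit(X, y)
print(ngb.pred_dist(X[:3]).params)  # Normal loc/scale for the first 3 rows
```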

patrickocal commented 4 months ago

Okay, the fact that $\{0,1\}$ is a subset of the reals is not the point, @alejandroschuler @avati: obviously. It is the statement $x \in \mathbb{R}^p$ alongside the language (in the paper) used to describe the dependent variable: why do you bother stating $y \in \{0,1\}$? This is what makes the paper misleading: to a mathematician, anyway. Take it as an opportunity to improve your paper. I found Friedman's omnireg much clearer.

alejandroschuler commented 4 months ago

🙄