-
**Describe the issue**
This is an omission in your benchmarking. I think you should look at the "time-series-bakeoff" for others like it.
DTW is a pain to implement, but it is a very well-known and widely used method.
The …
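For reference, the method mentioned above can be sketched in a few lines. This is a minimal, illustrative dynamic-time-warping distance on 1-D sequences (plain dynamic programming, no window constraint); `dtw_distance` is a hypothetical name, not an API from any benchmark suite.

```python
def dtw_distance(a, b):
    """Classic DTW distance between two 1-D sequences via dynamic programming."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = DTW cost of aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

Because warping lets one point align with several, a series and a time-stretched copy of it have distance zero, which is exactly what Euclidean distance misses.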
-
I can't grow basic trees without having specialised seed generators, which take special trees, effectively capping my factory. I'm pretty sure you could have basic tree seed generators before? This seem…
-
It may be possible to use non-gradient-based techniques like `fmin_powell`. See http://www.scipy-lectures.org/advanced/mathematical_optimization/index.html#a-shooting-method-the-powell-algorithm
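A minimal sketch of what that could look like, assuming SciPy is available; the quadratic objective here is purely illustrative:

```python
from scipy.optimize import fmin_powell

# A smooth objective with a known minimum at (1.0, 2.5).
# Powell's method only evaluates the function, never its gradient.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

xopt = fmin_powell(objective, x0=[0.0, 0.0], disp=False)
```

Since no derivatives are required, the same call works for noisy or non-differentiable objectives where gradient-based optimizers fail.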
-
### Describe the workflow you want to enable
The histogram gradient-boosted decision trees usually do not fulfil the so-called *balance property* on the training data, i.e. `sum([proba]predictions) =…
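To make the property concrete: it requires the sum of training predictions to equal the sum of training targets. A minimal sketch, with made-up numbers and a simple additive intercept correction as one possible post-hoc fix (not the library's mechanism):

```python
# Hypothetical training targets and model predictions.
y_train = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.2, 1.9, 2.8, 3.5]

# Balance property: sum(y_pred) should equal sum(y_train).
# It is violated here; a constant offset restores it exactly.
offset = (sum(y_train) - sum(y_pred)) / len(y_train)
y_calibrated = [p + offset for p in y_pred]
```

The offset shifts every prediction by the mean residual, so the totals match while relative ordering is preserved.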
-
This is an important feature; however, there are several different ways one could decide when to stop training early.
At the moment the training iteration goes …
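One of the stopping criteria alluded to above can be sketched as patience-based early stopping; all names here are illustrative, not an existing API:

```python
def should_stop(val_losses, patience=5, min_delta=1e-4):
    """Stop when the validation loss has not improved by at least
    min_delta over the last `patience` iterations."""
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    recent_best = min(val_losses[-patience:])
    return recent_best > best_before - min_delta
```

Other criteria (absolute loss threshold, relative improvement, wall-clock budget) would plug into the same per-iteration check, which is why the choice of rule matters for the design.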
-
The Criteo dataset is a 1 TB dump of features around advertisements and whether or not someone clicked on the ad. It has both dense and categorical/sparse data. I believe that the data is freely av…
-
Microsoft pre-released the Explainable Boosting Machine two weeks ago:
https://github.com/microsoft/interpret
It has a very promising performance profile:
| Dataset/AUROC | Domain | Logistic Regr…
-
Hello,
I have the following problem: I noticed that when using the new version (1.2.0) of TensorFlow Decision Forests, if I pickle a trained model (gradient boosted trees), then load it and run the…
-
I'm aware of the extended discussion in [#50](https://github.com/rstudio/bundle/issues/50) but I was still somewhat surprised that an xgboost model object after bundling & unbundling returns garbled v…
-
There are two issues:
1) setting the seed does not ensure reproducibility of the model
2) when no predictors are used in any splits, the predicted values are somewhat inconsistent; sometimes NA …