-
This code gives an **AUC of 0.7417 in 12.1s for the 1M training set on an r3.8xlarge EC2 instance** with the latest release of Datacratic's Machine Learning Database (MLDB), available at http://mldb.a…
-
Hi @jssalonen, I just wanted to update you a bit on what I've learned from these slow processes, and maybe see if you want to try this semi-solution.
I've been running randomForest over the last 11 d…
-
@tqchen @hetong007 I'm trying to get a good AUC with boosting for the largest dataset (n = 10M). Would be nice to beat random forests :)
So far I did some basic grid search https://github.com/szilard…
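A basic grid search scored on AUC, like the one described, could be sketched as follows (hypothetical parameter grid; scikit-learn's `GradientBoostingClassifier` is used here as a small stand-in for the boosting implementation being tuned, and the synthetic data stands in for the real 10M-row dataset):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Small synthetic binary-classification problem (stand-in for the real data).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Hypothetical grid over the usual boosting knobs.
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [2, 4],
    "learning_rate": [0.1, 0.3],
}

# Cross-validated grid search, scored on area under the ROC curve.
search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    scoring="roc_auc",
    cv=3,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

On the full dataset the same grid would of course be run with the actual boosting library and a much coarser-to-finer search schedule.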
-
What if at every split we just sample two items and compute the hyperplane that separates them?
For angular distance you would normalize the two items first. This would also generalize to arbitrary p…
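A minimal sketch of that split rule (hypothetical helper names, NumPy for the vector math): sample two items, take the perpendicular bisector of the segment between them as the splitting hyperplane, and normalize the two items first when the metric is angular.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_hyperplane(points, angular=False):
    # Sample two distinct items and build the separating hyperplane:
    # the perpendicular bisector of the segment between them.
    i, j = rng.choice(len(points), size=2, replace=False)
    a, b = points[i], points[j]
    if angular:
        # For angular distance, project both items onto the unit sphere first.
        a = a / np.linalg.norm(a)
        b = b / np.linalg.norm(b)
    normal = a - b                       # hyperplane normal
    offset = normal.dot((a + b) / 2.0)   # passes through the midpoint
    return normal, offset

def side(point, normal, offset):
    # Which side of the hyperplane the point falls on.
    return normal.dot(point) - offset >= 0
```

By construction the two sampled items always land on opposite sides, so every split is guaranteed to separate at least that pair.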
-
*Issue migrated from trac ticket # 2317*
**milestone:** HeuristicLab 3.3.12 | **component:** General | **priority:** medium | **resolution:** done
#### 2015-02-13 14:27:21: @Shabbafru created the is…
-
When running a gbm model on a fairly modest dataset (15k observations and 31 predictors) I noticed that my computer basically ran out of memory during training. Also, when the analysis finished after …
wabee updated 8 years ago
-
```
gbm() works great with multinomial outcomes, but gbm.fit() does not. Since
gbm() requires a formula, it is less efficient; using gbm.fit() would be
preferred for me (and works for other distri…
-
```
The issue is that gbm.fit does not return train.fraction. A related issue has
been described by Elisabeth Freeman:
Thank you for the windows binary.
I have started testing ModelMap with the new…
-
Sent to me by email.
I am testing ‘gbm’ on some new data. Using gbm v2.1-05, R 3.0.3 (using the GUI), Mac OS 10.9.2. I’ve attached a .rds file with the test data. My response variable is a factor con…
-
Dear Harry,
I am using the ‘dismo’ package to conduct boosted regression trees (BRT) for both binary and count data. The dismo package uses ‘gbm’ package for the implementation of BRT. I would like t…