Open jaredlander opened 5 years ago
Hi @juliasilge, as I wrote via e-mail, it would be useful to implement an xgboost regression model with a linear booster. My targets are derived from counts or observed as percentages, so I think an ensemble model with a linear booster would be better for testing different distributions. I also need to change the loss function because my target has a skewed distribution; it would be amazing if I could manage this through the linear booster parameters. Thanks, have a nice day. MC
I would love to see a linear booster option added as well, thus enabling tuning for the three parameters listed with the linear booster in the xgboost package reference manual.
{xgboost} can fit boosted, penalized linear models by setting `booster = "gblinear"`. This would be a great addition to `linear_reg()`, or perhaps there can be a `boost_linear_reg()` function and it can also include `gbm()`. Making this addition can probably take advantage of the existing `xgb.train()` implementation, because it only requires changing `booster` and then using the `lambda`, `lambda_bias`, and `alpha` parameters consistently with the `penalty` and `mixture` arguments.

This then also brings up the question of fitting pseudo random forests with {xgboost}. Should this be added to `rand_forest()`, or should the `num_parallel_tree` argument just be used when calling `boost_tree()`?
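As a rough sketch of what is being proposed (plain {xgboost}, not parsnip code; the data and parameter values are placeholders), fitting a penalized linear booster and a pseudo random forest might look like this:

```r
library(xgboost)

# toy regression data (placeholder)
set.seed(1)
x <- matrix(rnorm(100 * 5), ncol = 5)
y <- as.numeric(x %*% rnorm(5) + rnorm(100))
dtrain <- xgb.DMatrix(x, label = y)

# penalized linear booster: lambda (L2) and alpha (L1) are the
# parameters that would map onto parsnip's penalty/mixture arguments
fit_linear <- xgb.train(
  params = list(
    booster   = "gblinear",
    objective = "reg:squarederror",
    lambda    = 0.1,
    alpha     = 0.01
  ),
  data = dtrain,
  nrounds = 50
)

# pseudo random forest: many parallel trees grown in a single
# boosting round, with row/column subsampling
fit_rf <- xgb.train(
  params = list(
    booster           = "gbtree",
    objective         = "reg:squarederror",
    num_parallel_tree = 500,
    subsample         = 0.8,
    colsample_bynode  = 0.8,
    eta               = 1
  ),
  data = dtrain,
  nrounds = 1
)
```

The parsnip question is then just which user-facing function exposes each of these parameterizations.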