david-cortes opened 8 months ago
Thank you for raising the issue.
It's a really old warning that predates my involvement. From my understanding, xgboost initially tried to support a sequence of updaters, as in the snippet you shared, but in practice this doesn't work in most cases, since the updater/optimizer keeps internal state to track things like the prediction cache. In addition, the `prune` updater is not very useful during training, since not growing a tree is as good as (if not better than) pruning it afterwards.
We kept the `updater` parameter for `gbtree`, but recommend using `tree_method` instead.
I will try to elevate the rest of the updaters into tree methods.
If one passes the parameter `updater`, xgboost throws a warning about `tree_method` being ignored, even if `tree_method` is not passed. The warning occurs in both Python and R, and only the first time that `xgb.train` with said parameter is executed; subsequent executions of the same call to `xgb.train` do not produce the warning again.