dmlc / XGBoost.jl

XGBoost Julia Package

Can not reduce verbosity #161

Closed rwarnung closed 1 year ago

rwarnung commented 1 year ago

Could you please tell me how I can reduce verbosity? I have set print_every_n => Int(0) and verbosity=0, but I still get the info line for every boosting step. I am using package version 2.2.1 on Julia v1.8.2. There is a newer package version, but I cannot upgrade at the moment; however, I doubt this is an issue with the package version.

The following is an example. It prints train-rmse for every iteration (all 50 in this example). How can I suppress this?

Thank you!

using XGBoost

nobs = Int(10000)
num_feat = Int(20)

T = Float64  # element type; T was left undefined in the original snippet

x_train = rand(T, nobs, num_feat)
y_train = rand(T, size(x_train, 1))

params_xgb = Dict(
    :max_depth => Int(2),
    :eta => 0.01,
    :objective => "reg:squarederror",
    :print_every_n => Int(0)
)

dtrain = DMatrix(x_train, y_train .- 1)
@time m_xgb = xgboost(dtrain, num_round=50, verbosity=0, param = params_xgb);
pred_xgb = XGBoost.predict(m_xgb, x_train);
size(pred_xgb)
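For context on the eventual resolution: in XGBoost.jl 2.x the per-iteration train-rmse lines come from the evaluation watchlist, not from `verbosity`, so passing an empty watchlist silences them. The sketch below assumes the `watchlist` keyword of `xgboost` accepts an empty `Dict`; treat it as a hedged illustration rather than a quote of the (elided) answer in this thread.

```julia
using XGBoost

# Synthetic regression data, mirroring the snippet above.
T = Float64
x_train = rand(T, 10_000, 20)
y_train = rand(T, size(x_train, 1))
dtrain = DMatrix(x_train, y_train)

# Assumption: the default watchlist includes the training set, which is
# what produces the train-rmse line each round. An empty Dict leaves
# nothing to evaluate, so nothing is printed per round.
m = xgboost(dtrain;
    num_round = 50,
    watchlist = Dict(),   # no evaluation sets -> no per-round output
    max_depth = 2,
    eta = 0.01,
    objective = "reg:squarederror",
)

pred = predict(m, x_train)
```

The same silencing can be applied to the original call by adding `watchlist = Dict()` alongside `num_round` and the parameter `Dict`.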
Moelf commented 1 year ago
ExpandingMan commented 1 year ago

I'm somewhat loath to do more hacking of the default arguments, but this issue has come up so many times that I wonder if we should special-case verbosity=0.

rwarnung commented 1 year ago

Thanks @Moelf and @ExpandingMan. This solves the issue. Sorry that I didn't find it myself.

Moelf commented 1 year ago

@ExpandingMan we just need to document it, honestly