Open jxu opened 11 months ago
You're correct that there's not an option to show that xgboost output!
A reprex:
library(tidymodels)

bt_mod <-
  boost_tree(mode = "regression") %>%
  set_engine("xgboost", verbose = 1)

bt_mod_tune <-
  boost_tree(mode = "regression", loss_reduction = tune()) %>%
  set_engine("xgboost", verbose = 1)

bt_fit <- fit(bt_mod, Sale_Price ~ ., ames)
#> [1] training-rmse:140957.756219
#> [2] training-rmse:101521.900279
#> [3] training-rmse:74031.152726
#> [4] training-rmse:54589.486659
#> [5] training-rmse:41193.492679
#> [6] training-rmse:31989.714402
#> [7] training-rmse:25764.773577
#> [8] training-rmse:21675.651543
#> [9] training-rmse:18897.830517
#> [10] training-rmse:16933.370379
#> [11] training-rmse:15584.564895
#> [12] training-rmse:14633.219683
#> [13] training-rmse:14098.681762
#> [14] training-rmse:13556.029486
#> [15] training-rmse:13215.934462
bt_res <-
  tune_grid(
    bt_mod_tune,
    Sale_Price ~ .,
    resamples = bootstraps(ames, 3),
    grid = 3,
    control = control_grid(verbose = TRUE),
    metrics = metric_set(rmse)
  )
#> i Bootstrap1: preprocessor 1/1
#> ✓ Bootstrap1: preprocessor 1/1
#> i Bootstrap1: preprocessor 1/1, model 1/3
#> ✓ Bootstrap1: preprocessor 1/1, model 1/3
#> i Bootstrap1: preprocessor 1/1, model 1/3 (extracts)
#> i Bootstrap1: preprocessor 1/1, model 1/3 (predictions)
#> i Bootstrap1: preprocessor 1/1, model 2/3
#> ✓ Bootstrap1: preprocessor 1/1, model 2/3
#> i Bootstrap1: preprocessor 1/1, model 2/3 (extracts)
#> i Bootstrap1: preprocessor 1/1, model 2/3 (predictions)
#> i Bootstrap1: preprocessor 1/1, model 3/3
#> ✓ Bootstrap1: preprocessor 1/1, model 3/3
#> i Bootstrap1: preprocessor 1/1, model 3/3 (extracts)
#> i Bootstrap1: preprocessor 1/1, model 3/3 (predictions)
#> i Bootstrap2: preprocessor 1/1
#> ✓ Bootstrap2: preprocessor 1/1
#> i Bootstrap2: preprocessor 1/1, model 1/3
#> ✓ Bootstrap2: preprocessor 1/1, model 1/3
#> i Bootstrap2: preprocessor 1/1, model 1/3 (extracts)
#> i Bootstrap2: preprocessor 1/1, model 1/3 (predictions)
#> i Bootstrap2: preprocessor 1/1, model 2/3
#> ✓ Bootstrap2: preprocessor 1/1, model 2/3
#> i Bootstrap2: preprocessor 1/1, model 2/3 (extracts)
#> i Bootstrap2: preprocessor 1/1, model 2/3 (predictions)
#> i Bootstrap2: preprocessor 1/1, model 3/3
#> ✓ Bootstrap2: preprocessor 1/1, model 3/3
#> i Bootstrap2: preprocessor 1/1, model 3/3 (extracts)
#> i Bootstrap2: preprocessor 1/1, model 3/3 (predictions)
#> i Bootstrap3: preprocessor 1/1
#> ✓ Bootstrap3: preprocessor 1/1
#> i Bootstrap3: preprocessor 1/1, model 1/3
#> ✓ Bootstrap3: preprocessor 1/1, model 1/3
#> i Bootstrap3: preprocessor 1/1, model 1/3 (extracts)
#> i Bootstrap3: preprocessor 1/1, model 1/3 (predictions)
#> i Bootstrap3: preprocessor 1/1, model 2/3
#> ✓ Bootstrap3: preprocessor 1/1, model 2/3
#> i Bootstrap3: preprocessor 1/1, model 2/3 (extracts)
#> i Bootstrap3: preprocessor 1/1, model 2/3 (predictions)
#> i Bootstrap3: preprocessor 1/1, model 3/3
#> ✓ Bootstrap3: preprocessor 1/1, model 3/3
#> i Bootstrap3: preprocessor 1/1, model 3/3 (extracts)
#> i Bootstrap3: preprocessor 1/1, model 3/3 (predictions)
Created on 2024-01-26 with reprex v2.1.0
There's a parsnip control argument that controls it. We should probably add that argument to downstream control functions (I don't think there is a way to pass the parsnip control through).
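For reference, at the single-model level that control can be passed straight to fit(). A minimal sketch, assuming the argument in question is verbosity in control_parsnip() (with the default of 1 the engine's output comes through; 0 captures it):

library(tidymodels)

bt_mod <-
  boost_tree(mode = "regression") %>%
  set_engine("xgboost", verbose = 1)

# With verbosity = 0L, parsnip should capture the engine's console output;
# the default verbosity = 1L lets xgboost's per-iteration lines print.
bt_fit_quiet <- fit(
  bt_mod,
  Sale_Price ~ .,
  data = ames,
  control = control_parsnip(verbosity = 0L)
)

tune's control_grid() doesn't expose an equivalent, which is the gap described above.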
Normally with xgboost, fit(..., verbose = 1) gives me a line for each tree and the evaluation metric. With tune_grid(control = control_grid(verbose = T)), I get messages for each model tried, but not the xgboost output. Is there a more verbose option? Here's my config:
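(For context, the per-tree lines being described are xgboost's own evaluation log. A rough illustration with xgboost directly, using placeholder data and the classic xgb.train()/watchlist interface; this is not the configuration referred to above:)

library(xgboost)

# verbose = 1 prints the evaluation metric after each boosting round,
# as long as a watchlist is supplied (placeholder data, illustration only)
dtrain <- xgb.DMatrix(as.matrix(mtcars[, -1]), label = mtcars$mpg)
xgb_fit <- xgb.train(
  params = list(objective = "reg:squarederror"),
  data = dtrain,
  nrounds = 5,
  watchlist = list(training = dtrain),
  verbose = 1
)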