Can rBayesianOptimization find good parameters for XGBoost when the objective requires minimizing an error score like logloss?
Yes, you can maximize AUC directly through Bayesian Optimization, or maximize the negative LogLoss (equivalent to minimizing LogLoss), since BayesianOptimization always maximizes the Score it is given.
Hm... so is it something like below, where I switch maximize to FALSE and then take the min score?
xgb_cv_bayes <- function(max.depth, min_child_weight, subsample) {
  cv <- xgb.cv(params = list(booster = "gbtree", eta = 0.01,
                             max_depth = max.depth,
                             min_child_weight = min_child_weight,
                             subsample = subsample, colsample_bytree = 0.3,
                             lambda = 1, alpha = 0,
                             objective = "binary:logistic",
                             eval_metric = "auc"),
               data = dtrain, nround = 100,
               folds = cv_folds, prediction = TRUE, showsd = TRUE,
               early.stop.round = 5, maximize = FALSE, verbose = 0)
  list(Score = cv$dt[, min(test.auc.mean)],
       Pred = cv$pred)
}

OPT_Res <- BayesianOptimization(xgb_cv_bayes,
                                bounds = list(max.depth = c(2L, 6L),
                                              min_child_weight = c(1L, 10L),
                                              subsample = c(0.5, 0.8)),
                                init_points = 10, n_iter = 20,
                                acq = "ucb", kappa = 2.576, eps = 0.0,
                                verbose = TRUE)
No, it should be Score = -cv$dt[, min(test.logloss.mean)] (with eval_metric = "logloss"): BayesianOptimization always maximizes Score, so you negate the minimum logloss in order to minimize it.
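For reference, a minimal sketch of the corrected function, assuming the same dtrain and cv_folds objects and the older xgb.cv interface used above (where the cross-validation history comes back as cv$dt with dotted column names; newer xgboost releases return it as cv$evaluation_log with test_logloss_mean and spell the argument early_stopping_rounds):

library(xgboost)
library(rBayesianOptimization)

xgb_cv_bayes <- function(max.depth, min_child_weight, subsample) {
  cv <- xgb.cv(params = list(booster = "gbtree", eta = 0.01,
                             max_depth = max.depth,
                             min_child_weight = min_child_weight,
                             subsample = subsample, colsample_bytree = 0.3,
                             lambda = 1, alpha = 0,
                             objective = "binary:logistic",
                             # track logloss instead of auc so test.logloss.mean exists
                             eval_metric = "logloss"),
               data = dtrain, nround = 100,
               folds = cv_folds, prediction = TRUE, showsd = TRUE,
               early.stop.round = 5, maximize = FALSE, verbose = 0)
  # BayesianOptimization maximizes Score, so negate the minimum mean test logloss
  list(Score = -cv$dt[, min(test.logloss.mean)],
       Pred = cv$pred)
}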
Thank you. That worked great.
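Incidentally, BayesianOptimization returns the winning parameters in OPT_Res$Best_Par and the best Score in OPT_Res$Best_Value; since Score here is the negated logloss, flip the sign to recover the metric itself:

OPT_Res$Best_Par     # named vector of the best max.depth, min_child_weight, subsample
OPT_Res$Best_Value   # highest Score found, i.e. the negative of the lowest mean test logloss
-OPT_Res$Best_Value  # the best mean test logloss itself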