mb706 / autoxgboost3

autoxgboost emulation for mlr3

Parameter xgboost.cale_pos_weight not available on dataset gisette #3

Closed SebGGruber closed 4 years ago

SebGGruber commented 4 years ago

Code to reproduce the error (the first part is for downloading the data):

ll = mlr3misc::encapsulate("callr", function(task.id) {
  library(mlr)
  library(OpenML)

  setOMLConfig(arff.reader = "RWeka")
  OMLtask = convertOMLTaskToMlr(getOMLTask(task.id))
  data = getTaskData(OMLtask$mlr.task)
  rin = OMLtask$mlr.rin
  rdesc = OMLtask$mlr.rin$desc
  task.id = OMLtask$mlr.task$task.desc$id
  task.type = OMLtask$mlr.task$type
  target = OMLtask$mlr.task$task.desc$target

  return(list(task.id = task.id, task.type = task.type, data = data, rin = rin, rdesc = rdesc, target = target))
}, .args = list(task.id = 167213))

res = ll$result
row.names(res$data) = as.integer(row.names(res$data))

if (res$task.type == "classif") {
  task = mlr3::TaskClassif$new(id = res$task.id, backend = res$data, target = res$target)
} else {
  task = mlr3::TaskRegr$new(id = res$task.id, backend = res$data, target = res$target)
}

library(autoxgboost)
library(mlr3tuning)
library(checkmate)
data = "data/gisette"
task = readRDS(file.path(data, "task.rds"))
axgb_settings = autoxgboost_space(task, tune.threshold = FALSE)
rsmp_inner = axgb_settings$resampling
learner = axgb_settings$learner
ps = axgb_settings$searchspace
#ps$add(learner$param_set$params$xgboost.booster)
#ps$add(ParamFct$new(id = "xgboost.booster", levels = "gblinear", default = "gblinear"))
ti = TuningInstance$new(task = task, learner = learner, resampling = rsmp_inner, param_set = ps, measures = msr("classif.ce"), terminator = term("evals", n_evals = 1))
tuner = tnr("random_search")
tuner$tune(ti)

This throws the following error:

> tuner$tune(ti)
INFO  [12:11:27.547] Starting to tune 9 parameters with '<TunerRandomSearch>' and '<TerminatorEvals>'
INFO  [12:11:27.580] Terminator settings: n_evals=1
INFO  [12:11:27.632] Evaluating 1 configurations
INFO  [12:11:27.639]  xgboost.eta xgboost.gamma xgboost.max_depth
INFO  [12:11:27.639]    0.1015153     -6.913219                14
INFO  [12:11:27.639]  xgboost.colsample_bytree xgboost.colsample_bylevel
INFO  [12:11:27.639]                 0.6309831                 0.6852687
INFO  [12:11:27.639]  xgboost.lambda xgboost.alpha xgboost.subsample
INFO  [12:11:27.639]       -7.631537      4.986284         0.7418983
INFO  [12:11:27.639]  xgboosts.cale_pos_weight
INFO  [12:11:27.639]                 -4.246136
Error in (function (xs)  :
  Assertion on 'xs' failed: Parameter 'xgboosts.cale_pos_weight' not available..
SebGGruber commented 4 years ago

Dropping the parameter xgboosts.cale_pos_weight seems to prevent the error. The question is whether this has any further undesired consequences...
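As a side note, one way to drop the offending id from the search space is to subset the paradox ParamSet to all other ids. This is a minimal sketch, assuming the pre-1.0 paradox API (with `ParamSet$subset()`, which mutates the set in place); the ids here are illustrative stand-ins for the full autoxgboost search space:

```r
library(paradox)

# toy search space containing the misspelled parameter id
ps = ParamSet$new(list(
  ParamDbl$new("xgboost.eta", lower = 0.01, upper = 0.2),
  ParamDbl$new("xgboosts.cale_pos_weight", lower = -10, upper = 10)
))

# keep every parameter except the offending one (subset() modifies ps in place)
ps$subset(setdiff(ps$ids(), "xgboosts.cale_pos_weight"))

ps$ids()  # the misspelled id is gone
```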

SebGGruber commented 4 years ago

Ok, so I figured out the parameter name is a typo and is supposed to be xgboosts.scale_pos_weight. But unfortunately, fixing this typo in the autoxgboost code still leads to the same error:

Error in (function (xs)  :
  Assertion on 'xs' failed: Parameter 'xgboosts.scale_pos_weight' not available..
SebGGruber commented 4 years ago

I'm so done... it's not xgboosts.scale_pos_weight, but xgboost.scale_pos_weight. The s was never missing, it belongs before the dot, not after it. It works now xD
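For anyone hitting the same assertion: the "Parameter '...' not available" error comes straight from the ParamSet check on the proposed configuration, so the parameter id in the search space must match the learner's id character for character. A minimal sketch with the pre-1.0 paradox API, using a one-parameter set to show how the correct and the misspelled id behave:

```r
library(paradox)

# search space with the correctly spelled id
ps = ParamSet$new(list(
  ParamDbl$new("xgboost.scale_pos_weight", lower = -10, upper = 10)
))

# correct id: the configuration passes the check
ps$test(list(xgboost.scale_pos_weight = -4.2))   # should be TRUE

# misspelled id: fails with the same "not available" assertion as above
ps$test(list(xgboosts.cale_pos_weight = -4.2))   # should be FALSE
```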