PhilippPro / tuneRanger

Automatic tuning of random forests

Error when applying tuneRanger to regression data #4

Closed MManessa closed 5 years ago

MManessa commented 5 years ago

I ran into a problem when testing the function on my data, as follows:

```
df.task <- makeRegrTask(data = df.input, target = "depth")
estimateTimeTuneRanger(df.task)
Approximated time for tuning: 33M 12S
res = tuneRanger(df.task, measure = list(acc), num.trees = 1000, num.threads = 2, iters = 70)
Computing y column(s) for design. Not provided.
Mapping in parallel: mode = socket; cpus = 8; elements = 30.
Error in stopWithJobErrorMessages(inds, vcapply(result.list[inds], as.character)) :
  Errors occurred in 30 slave jobs, displaying at most 10 of them:
00001: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00002: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00003: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00004: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00005: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00006: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00007: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00008: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00009: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
00010: Error in opts$show.learner.output || inherits(learner, "OptWrapper") : invalid 'x' type in 'x || y'
```

```
head(df.input)
      band1      band2      band3      band4     X1.78     X2.78     X3.78     X1.06     X2.06
1 0.1009626 0.06760476 0.04149496 0.03278008 -4.965621 -4.854553 -5.260519 -3.875896 -4.545680
2 0.1198124 0.07806161 0.04495288 0.03549122 -3.656474 -4.003609 -4.750136 -3.412757 -4.113148
3 0.2992728 0.21156885 0.13459264 0.10647365 -1.583362 -1.885475 -2.319830 -3.252269 -3.632400
4 0.1218781 0.08073661 0.04468688 0.03524475 -3.579518 -3.866826 -4.781367 -3.334871 -3.939409
5 0.1208453 0.08122298 0.04468688 0.03524475 -3.617255 -3.843849 -4.781367 -3.364297 -3.914723
6 0.1128406 0.07392751 0.04282493 0.03376595 -3.971159 -4.260477 -5.032486 -3.499885 -4.185036
      X3.06 depth       C1a      C2a      C3a      C1b      C2b       C3b     BI12     BI23
1 -9.219036  8.20 1.5808484 3.088686 330.6637 48.22589 94.22447 10087.337 1.867034 5.766680
2 -8.922665 13.88 1.0771155 2.169894 266.1859 30.34879 61.13891  7500.051 1.783721 5.867693
3 -8.917605  5.70 2.7522298 4.025071 794.5272 25.84893 37.80345  7462.196 1.336840 6.342781
4 -8.630425  1.70 0.9894876 1.811165 197.3514 28.07475 51.38822  5599.455 1.642107 5.714534
5 -8.630425  8.90 1.0190371 1.767001 197.3514 28.91316 50.13516  5599.455 1.581493 5.739220
6 -8.599970 14.80 1.1180458 2.218283 183.3997 33.11163 65.69585  5431.497 1.787414 5.434841
```
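In case it helps, this is how I would write the same call with a regression measure from mlr (for example mse) instead of acc; I am not sure whether acc being a classification-only measure is what triggers the error:

```r
library(mlr)
library(tuneRanger)

# Same setup as above, but tuning against mse (an mlr regression measure)
# rather than the classification measure acc.
df.task <- makeRegrTask(data = df.input, target = "depth")
res <- tuneRanger(df.task, measure = list(mse), num.trees = 1000,
                  num.threads = 2, iters = 70)
```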

Could you help me understand what is causing these errors?

Regards,
Manessa