ymattu / MlBayesOpt

R package to tune hyperparameters for machine learning models (Support Vector Machine, Random Forest, and XGBoost) using Bayesian optimization with a Gaussian process

XGBoost 0.81.0.1 breaks MlBayesOpt's tests #64

Open hetong007 opened 5 years ago

hetong007 commented 5 years ago

Hello @ymattu ,

This is Tong, maintainer of the XGBoost R package. We are planning to submit version 0.81.0.1 to CRAN soon.

However, during the submission process CRAN reported that our update breaks your tests. The error message is attached below. We would appreciate it if you could check and update the package. Thanks!

Package: MlBayesOpt
Check: tests
New result: ERROR
  Running ‘testthat.R’ [48s/48s]
Running the tests in ‘tests/testthat.R’ failed.
Complete output:

library(testthat)
library(MlBayesOpt)

test_check("MlBayesOpt")
elapsed = 0.02      Round = 1       mtry_opt = 3.6634       min_node_size = 7.0000  Value = 0.1800 
elapsed = 0.02      Round = 2       mtry_opt = 5.4408       min_node_size = 4.0000  Value = 0.1400 
elapsed = 0.01      Round = 3       mtry_opt = 3.6190       min_node_size = 7.0000  Value = 0.1800 
elapsed = 0.01      Round = 4       mtry_opt = 2.6933       min_node_size = 3.0000  Value = 0.2000 
elapsed = 0.01      Round = 5       mtry_opt = 3.5290       min_node_size = 3.0000  Value = 0.1600 
elapsed = 0.01      Round = 6       mtry_opt = 8.5781       min_node_size = 5.0000  Value = 0.1500 
elapsed = 0.01      Round = 7       mtry_opt = 6.2937       min_node_size = 5.0000  Value = 0.1600 
elapsed = 0.01      Round = 8       mtry_opt = 8.1154       min_node_size = 4.0000  Value = 0.1400 
elapsed = 0.01      Round = 9       mtry_opt = 3.7041       min_node_size = 4.0000  Value = 0.1700 
elapsed = 0.01      Round = 10      mtry_opt = 4.4780       min_node_size = 9.0000  Value = 0.1800 
elapsed = 0.01      Round = 11      mtry_opt = 1.9407       min_node_size = 1.0000  Value = 0.1600 
elapsed = 0.01      Round = 12      mtry_opt = 7.0937       min_node_size = 6.0000  Value = 0.1300 
elapsed = 0.01      Round = 13      mtry_opt = 2.1344       min_node_size = 8.0000  Value = 0.1500 
elapsed = 0.01      Round = 14      mtry_opt = 7.1353       min_node_size = 2.0000  Value = 0.1400 
elapsed = 0.01      Round = 15      mtry_opt = 7.7371       min_node_size = 8.0000  Value = 0.1400 
elapsed = 0.01      Round = 16      mtry_opt = 7.2140       min_node_size = 9.0000  Value = 0.1700 
elapsed = 0.01      Round = 17      mtry_opt = 2.0706       min_node_size = 5.0000  Value = 0.1700 
elapsed = 0.01      Round = 18      mtry_opt = 7.4475       min_node_size = 3.0000  Value = 0.1400 
elapsed = 0.01      Round = 19      mtry_opt = 8.1743       min_node_size = 5.0000  Value = 0.1700 
elapsed = 0.01      Round = 20      mtry_opt = 8.4158       min_node_size = 1.0000  Value = 0.1500 
elapsed = 0.01      Round = 21      mtry_opt = 2.5509       min_node_size = 3.0000  Value = 0.1700 

 Best Parameters Found: 
Round = 4   mtry_opt = 2.6933       min_node_size = 3.0000  Value = 0.2000 
List of 4
 $ Best_Par  : Named num [1:2] 2.69 3
  ..- attr(*, "names")= chr [1:2] "mtry_opt" "min_node_size"
 $ Best_Value: num 0.2
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round        : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ mtry_opt     : num [1:21] 3.66 5.44 3.62 2.69 3.53 ...
  ..$ min_node_size: num [1:21] 7 4 7 3 3 5 5 4 4 9 ...
  ..$ Value        : num [1:21] 0.18 0.14 0.18 0.2 0.16 0.15 0.16 0.14 0.17 0.18 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        1 obs. of  21 variables:
  ..$ V1 : num 0.18
  ..$ V2 : num 0.14
  ..$ V3 : num 0.18
  ..$ V4 : num 0.2
  ..$ V5 : num 0.16
  ..$ V6 : num 0.15
  ..$ V7 : num 0.16
  ..$ V8 : num 0.14
  ..$ V9 : num 0.17
  ..$ V10: num 0.18
  ..$ V11: num 0.16
  ..$ V12: num 0.13
  ..$ V13: num 0.15
  ..$ V14: num 0.14
  ..$ V15: num 0.14
  ..$ V16: num 0.17
  ..$ V17: num 0.17
  ..$ V18: num 0.14
  ..$ V19: num 0.17
  ..$ V20: num 0.15
  ..$ V21: num 0.17
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.01      Round = 1       gamma_opt = 3.3299      cost_opt = 61.5259      Value = 0.1900 
elapsed = 0.01      Round = 2       gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2100 
elapsed = 0.01      Round = 3       gamma_opt = 3.2744      cost_opt = 70.8278      Value = 0.1700 
elapsed = 0.01      Round = 4       gamma_opt = 2.1175      cost_opt = 21.9740      Value = 0.1600 
elapsed = 0.01      Round = 5       gamma_opt = 3.1619      cost_opt = 19.3146      Value = 0.1600 
elapsed = 0.01      Round = 6       gamma_opt = 9.4727      cost_opt = 46.3378      Value = 0.1600 
elapsed = 0.01      Round = 7       gamma_opt = 6.6175      cost_opt = 41.6790      Value = 0.1400 
elapsed = 0.01      Round = 8       gamma_opt = 8.8943      cost_opt = 33.0888      Value = 0.1300 
elapsed = 0.01      Round = 9       gamma_opt = 3.3808      cost_opt = 29.9110      Value = 0.0800 
elapsed = 0.01      Round = 10      gamma_opt = 4.3481      cost_opt = 88.7062      Value = 0.1500 
elapsed = 0.01      Round = 11      gamma_opt = 1.1767      cost_opt = 5.2563       Value = 0.1300 
elapsed = 0.01      Round = 12      gamma_opt = 7.6174      cost_opt = 60.4227      Value = 0.1500 
elapsed = 0.01      Round = 13      gamma_opt = 1.4188      cost_opt = 79.6450      Value = 0.1700 
elapsed = 0.01      Round = 14      gamma_opt = 7.6693      cost_opt = 6.2103       Value = 0.0900 
elapsed = 0.01      Round = 15      gamma_opt = 8.4215      cost_opt = 78.2717      Value = 0.1300 
elapsed = 0.01      Round = 16      gamma_opt = 7.7677      cost_opt = 83.7658      Value = 0.1800 
elapsed = 0.01      Round = 17      gamma_opt = 1.3391      cost_opt = 45.6691      Value = 0.1100 
elapsed = 0.01      Round = 18      gamma_opt = 8.0596      cost_opt = 22.1903      Value = 0.1500 
elapsed = 0.01      Round = 19      gamma_opt = 8.9679      cost_opt = 46.9767      Value = 0.2000 
elapsed = 0.01      Round = 20      gamma_opt = 9.2699      cost_opt = 3.9481       Value = 0.1100 
elapsed = 0.01      Round = 21      gamma_opt = 9.0152      cost_opt = 39.2284      Value = 0.2000 

 Best Parameters Found: 
Round = 2   gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2100 
List of 4
 $ Best_Par  : Named num [1:2] 5.55 28.76
  ..- attr(*, "names")= chr [1:2] "gamma_opt" "cost_opt"
 $ Best_Value: num 0.21
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round    : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ gamma_opt: num [1:21] 3.33 5.55 3.27 2.12 3.16 ...
  ..$ cost_opt : num [1:21] 61.5 28.8 70.8 22 19.3 ...
  ..$ Value    : num [1:21] 0.19 0.21 0.17 0.16 0.16 0.16 0.14 0.13 0.08 0.15 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        100 obs. of  21 variables:
  ..$ V1 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V2 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V3 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V4 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V5 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V6 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V7 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V8 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V9 : Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V10: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V11: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V12: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V13: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V14: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V15: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V16: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V17: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V18: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V19: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..$ V20: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 6 6 ...
  ..$ V21: Factor w/ 10 levels "0","1","2","3",..: 6 6 6 6 6 6 6 1 10 6 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.01      Round = 1       gamma_opt = 3.3299      cost_opt = 61.5259      Value = 0.1900 
elapsed = 0.01      Round = 2       gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2300 
elapsed = 0.01      Round = 3       gamma_opt = 3.2744      cost_opt = 70.8278      Value = 0.1900 
elapsed = 0.01      Round = 4       gamma_opt = 2.1175      cost_opt = 21.9740      Value = 0.1900 
elapsed = 0.01      Round = 5       gamma_opt = 3.1619      cost_opt = 19.3146      Value = 0.1900 
elapsed = 0.01      Round = 6       gamma_opt = 9.4727      cost_opt = 46.3378      Value = 0.2200 
elapsed = 0.01      Round = 7       gamma_opt = 6.6175      cost_opt = 41.6790      Value = 0.2200 
elapsed = 0.01      Round = 8       gamma_opt = 8.8943      cost_opt = 33.0888      Value = 0.2200 
elapsed = 0.01      Round = 9       gamma_opt = 3.3808      cost_opt = 29.9110      Value = 0.1900 
elapsed = 0.01      Round = 10      gamma_opt = 4.3481      cost_opt = 88.7062      Value = 0.2300 
elapsed = 0.01      Round = 11      gamma_opt = 1.1767      cost_opt = 5.2563       Value = 0.2000 
elapsed = 0.01      Round = 12      gamma_opt = 7.6174      cost_opt = 60.4227      Value = 0.2200 
elapsed = 0.01      Round = 13      gamma_opt = 1.4188      cost_opt = 79.6450      Value = 0.1800 
elapsed = 0.01      Round = 14      gamma_opt = 7.6693      cost_opt = 6.2103       Value = 0.2200 
elapsed = 0.01      Round = 15      gamma_opt = 8.4215      cost_opt = 78.2717      Value = 0.2300 
elapsed = 0.01      Round = 16      gamma_opt = 7.7677      cost_opt = 83.7658      Value = 0.2200 
elapsed = 0.01      Round = 17      gamma_opt = 1.3391      cost_opt = 45.6691      Value = 0.1800 
elapsed = 0.01      Round = 18      gamma_opt = 8.0596      cost_opt = 22.1903      Value = 0.2200 
elapsed = 0.01      Round = 19      gamma_opt = 8.9679      cost_opt = 46.9767      Value = 0.2200 
elapsed = 0.01      Round = 20      gamma_opt = 9.2699      cost_opt = 3.9481       Value = 0.1800 
elapsed = 0.01      Round = 21      gamma_opt = 9.6352      cost_opt = 14.7148      Value = 0.2200 

 Best Parameters Found: 
Round = 2   gamma_opt = 5.5515      cost_opt = 28.7558      Value = 0.2300 
List of 4
 $ Best_Par  : Named num [1:2] 5.55 28.76
  ..- attr(*, "names")= chr [1:2] "gamma_opt" "cost_opt"
 $ Best_Value: num 0.23
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  4 variables:
  ..$ Round    : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ gamma_opt: num [1:21] 3.33 5.55 3.27 2.12 3.16 ...
  ..$ cost_opt : num [1:21] 61.5 28.8 70.8 22 19.3 ...
  ..$ Value    : num [1:21] 0.19 0.23 0.19 0.19 0.19 0.22 0.22 0.22 0.19 0.23 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        1 obs. of  21 variables:
  ..$ V1 : num 0.19
  ..$ V2 : num 0.23
  ..$ V3 : num 0.19
  ..$ V4 : num 0.19
  ..$ V5 : num 0.19
  ..$ V6 : num 0.22
  ..$ V7 : num 0.22
  ..$ V8 : num 0.22
  ..$ V9 : num 0.19
  ..$ V10: num 0.23
  ..$ V11: num 0.2
  ..$ V12: num 0.22
  ..$ V13: num 0.18
  ..$ V14: num 0.22
  ..$ V15: num 0.23
  ..$ V16: num 0.22
  ..$ V17: num 0.18
  ..$ V18: num 0.22
  ..$ V19: num 0.22
  ..$ V20: num 0.18
  ..$ V21: num 0.22
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.02      Round = 1       eta_opt = 0.2854        max_depth_opt = 5.0000  nrounds_opt = 112.9858  subsample_opt = 0.4052  bytree_opt = 0.5438     Value = -0.3026 
elapsed = 0.01      Round = 2       eta_opt = 0.2589        max_depth_opt = 5.0000  nrounds_opt = 147.5089  subsample_opt = 0.8555  bytree_opt = 0.4354     Value = -0.1330 
elapsed = 0.01      Round = 3       eta_opt = 0.7183        max_depth_opt = 5.0000  nrounds_opt = 109.4287  subsample_opt = 0.4120  bytree_opt = 0.7854     Value = -0.0753 
elapsed = 0.01      Round = 4       eta_opt = 0.4457        max_depth_opt = 4.0000  nrounds_opt = 92.0318   subsample_opt = 0.4004  bytree_opt = 0.9258     Value = -0.0841 
elapsed = 0.01      Round = 5       eta_opt = 0.7929        max_depth_opt = 6.0000  nrounds_opt = 76.3611   subsample_opt = 0.5287  bytree_opt = 0.8673     Value = -0.0526 
elapsed = 0.01      Round = 6       eta_opt = 0.5479        max_depth_opt = 5.0000  nrounds_opt = 78.9520   subsample_opt = 0.9030  bytree_opt = 0.8784     Value = -0.0263 
elapsed = 0.01      Round = 7       eta_opt = 0.7459        max_depth_opt = 6.0000  nrounds_opt = 98.4645   subsample_opt = 0.8779  bytree_opt = 0.6732     Value = -0.0263 
elapsed = 0.01      Round = 8       eta_opt = 0.9927        max_depth_opt = 4.0000  nrounds_opt = 116.6771  subsample_opt = 0.4510  bytree_opt = 0.6461     Value = -0.0351 
elapsed = 0.01      Round = 9       eta_opt = 0.4420        max_depth_opt = 5.0000  nrounds_opt = 129.5805  subsample_opt = 0.7996  bytree_opt = 0.8865     Value = -0.0175 
elapsed = 0.01      Round = 10      eta_opt = 0.7997        max_depth_opt = 5.0000  nrounds_opt = 106.6147  subsample_opt = 0.9646  bytree_opt = 0.7630     Value = -0.0263 
elapsed = 0.01      Round = 11      eta_opt = 0.9412        max_depth_opt = 6.0000  nrounds_opt = 152.1588  subsample_opt = 0.4912  bytree_opt = 0.7928     Value = -0.0577 
elapsed = 0.01      Round = 12      eta_opt = 0.2909        max_depth_opt = 5.0000  nrounds_opt = 96.4243   subsample_opt = 0.7413  bytree_opt = 0.6119     Value = -0.0943 
elapsed = 0.01      Round = 13      eta_opt = 0.6865        max_depth_opt = 6.0000  nrounds_opt = 111.3159  subsample_opt = 0.4600  bytree_opt = 0.5622     Value = -0.1579 
elapsed = 0.01      Round = 14      eta_opt = 0.2130        max_depth_opt = 5.0000  nrounds_opt = 99.9155   subsample_opt = 0.3928  bytree_opt = 0.9956     Value = -0.1491 
elapsed = 0.01      Round = 15      eta_opt = 0.3405        max_depth_opt = 5.0000  nrounds_opt = 128.5783  subsample_opt = 0.7814  bytree_opt = 0.7801     Value = -0.0351 
elapsed = 0.01      Round = 16      eta_opt = 0.4475        max_depth_opt = 6.0000  nrounds_opt = 93.2215   subsample_opt = 0.2824  bytree_opt = 0.5279     Value = -0.3428 
elapsed = 0.01      Round = 17      eta_opt = 0.1121        max_depth_opt = 4.0000  nrounds_opt = 113.0691  subsample_opt = 0.7400  bytree_opt = 0.4776     Value = -0.1367 
elapsed = 0.01      Round = 18      eta_opt = 0.4441        max_depth_opt = 5.0000  nrounds_opt = 138.9680  subsample_opt = 0.2095  bytree_opt = 0.6869     Value = -0.5022 
elapsed = 0.01      Round = 19      eta_opt = 0.8827        max_depth_opt = 5.0000  nrounds_opt = 77.5822   subsample_opt = 0.3209  bytree_opt = 0.9544     Value = -0.1053 
elapsed = 0.01      Round = 20      eta_opt = 0.4063        max_depth_opt = 5.0000  nrounds_opt = 148.7789  subsample_opt = 0.2290  bytree_opt = 0.7593     Value = -0.3567 
elapsed = 0.01      Round = 21      eta_opt = 1.0000        max_depth_opt = 4.0000  nrounds_opt = 106.3408  subsample_opt = 0.6443  bytree_opt = 0.6353     Value = -0.0577 

 Best Parameters Found: 
Round = 9   eta_opt = 0.4420        max_depth_opt = 5.0000  nrounds_opt = 129.5805  subsample_opt = 0.7996  bytree_opt = 0.8865     Value = -0.0175 
List of 4
 $ Best_Par  : Named num [1:5] 0.442 5 129.58 0.8 0.887
  ..- attr(*, "names")= chr [1:5] "eta_opt" "max_depth_opt" "nrounds_opt" "subsample_opt" ...
 $ Best_Value: num -0.0175
 $ History   :Classes 'data.table' and 'data.frame':        21 obs. of  7 variables:
  ..$ Round        : int [1:21] 1 2 3 4 5 6 7 8 9 10 ...
  ..$ eta_opt      : num [1:21] 0.285 0.259 0.718 0.446 0.793 ...
  ..$ max_depth_opt: num [1:21] 5 5 5 4 6 5 6 4 5 5 ...
  ..$ nrounds_opt  : num [1:21] 113 147.5 109.4 92 76.4 ...
  ..$ subsample_opt: num [1:21] 0.405 0.855 0.412 0.4 0.529 ...
  ..$ bytree_opt   : num [1:21] 0.544 0.435 0.785 0.926 0.867 ...
  ..$ Value        : num [1:21] -0.3026 -0.133 -0.0753 -0.0841 -0.0526 ...
  ..- attr(*, ".internal.selfref")=<externalptr> 
 $ Pred      :Classes 'data.table' and 'data.frame':        100 obs. of  210 variables:
  ..$ V1    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V2    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V3    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V4    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V5    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V6    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V7    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V8    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V9    : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V10   : num [1:100] 0 0 0 9 0 9 5 5 0 6 ...
  ..$ V1.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V2.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V3.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V4.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V5.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V6.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V7.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V8.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V9.1  : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V10.1 : num [1:100] 5 5 5 8 9 8 3 0 5 8 ...
  ..$ V1.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V2.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V3.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V4.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V5.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V6.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V7.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V8.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V9.2  : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V10.2 : num [1:100] 9 7 5 7 9 7 3 0 6 6 ...
  ..$ V1.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V2.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V3.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V4.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V5.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V6.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V7.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V8.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V9.3  : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V10.3 : num [1:100] 7 7 3 8 7 9 0 9 6 6 ...
  ..$ V1.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V2.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V3.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V4.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V5.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V6.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V7.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V8.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V9.4  : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V10.4 : num [1:100] 9 6 3 8 9 8 3 0 6 6 ...
  ..$ V1.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V2.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V3.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V4.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V5.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V6.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V7.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V8.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V9.5  : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V10.5 : num [1:100] 8 7 3 8 7 9 3 0 6 6 ...
  ..$ V1.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V2.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V3.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V4.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V5.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V6.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V7.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V8.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V9.6  : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V10.6 : num [1:100] 8 7 3 8 9 5 3 0 6 5 ...
  ..$ V1.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V2.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V3.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V4.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V5.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V6.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V7.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V8.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V9.7  : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V10.7 : num [1:100] 8 6 9 8 0 8 2 0 6 6 ...
  ..$ V1.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V2.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V3.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V4.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V5.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V6.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V7.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V8.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V9.8  : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V10.8 : num [1:100] 7 7 3 8 9 8 3 0 6 6 ...
  ..$ V1.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V2.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V3.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V4.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V5.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V6.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V7.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V8.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  ..$ V9.9  : num [1:100] 8 7 3 8 9 9 3 0 6 7 ...
  .. [list output truncated]
  ..- attr(*, ".internal.selfref")=<externalptr> 
elapsed = 0.04      Round = 1       eta_opt = 0.3996        max_depth_opt = 5.0000  nrounds_opt = 103.8797  subsample_opt = 0.6901  bytree_opt = 0.5783     Value = 1.0000 
elapsed = 0.05      Round = 2       eta_opt = 0.5996        max_depth_opt = 5.0000  nrounds_opt = 125.7482  subsample_opt = 0.3096  bytree_opt = 0.6693     Value = 1.0000 
elapsed = 0.03      Round = 3       eta_opt = 0.3946        max_depth_opt = 5.0000  nrounds_opt = 73.3337   subsample_opt = 0.1606  bytree_opt = 0.8845     Value = 0.1800 
elapsed = 0.04      Round = 4       eta_opt = 0.2905        max_depth_opt = 4.0000  nrounds_opt = 129.3648  subsample_opt = 0.1475  bytree_opt = 0.5431     Value = 0.1800 
elapsed = 0.04      Round = 5       eta_opt = 0.3845        max_depth_opt = 4.0000  nrounds_opt = 106.4619  subsample_opt = 0.3976  bytree_opt = 0.4083     Value = 1.0000 
elapsed = 0.04      Round = 6       eta_opt = 0.9525        max_depth_opt = 5.0000  nrounds_opt = 127.4542  subsample_opt = 0.2646  bytree_opt = 0.4167     Value = 1.0000 
elapsed = 0.04      Round = 7       eta_opt = 0.6955        max_depth_opt = 5.0000  nrounds_opt = 119.2315  subsample_opt = 0.5751  bytree_opt = 0.4965     Value = 1.0000 
elapsed = 0.03      Round = 8       eta_opt = 0.9005        max_depth_opt = 5.0000  nrounds_opt = 81.0287   subsample_opt = 0.8342  bytree_opt = 0.6838     Value = 1.0000 
elapsed = 0.03      Round = 9       eta_opt = 0.4042        max_depth_opt = 5.0000  nrounds_opt = 73.5520   subsample_opt = 0.5461  bytree_opt = 0.6483     Value = 1.0000 
elapsed = 0.05      Round = 10      eta_opt = 0.4913        max_depth_opt = 6.0000  nrounds_opt = 144.0938  subsample_opt = 0.1334  bytree_opt = 0.6559     Value = 0.1900 
elapsed = 0.03      Round = 11      eta_opt = 0.2058        max_depth_opt = 4.0000  nrounds_opt = 72.1364   subsample_opt = 0.4510  bytree_opt = 0.4659     Value = 1.0000 
elapsed = 0.03      Round = 12      eta_opt = 0.7855        max_depth_opt = 5.0000  nrounds_opt = 81.2798   subsample_opt = 0.3255  bytree_opt = 0.7891     Value = 1.0000 
elapsed = 0.05      Round = 13      eta_opt = 0.2276        max_depth_opt = 6.0000  nrounds_opt = 124.3278  subsample_opt = 0.9381  bytree_opt = 0.7298     Value = 1.0000 
elapsed = 0.04      Round = 14      eta_opt = 0.7902        max_depth_opt = 4.0000  nrounds_opt = 115.3598  subsample_opt = 0.6396  bytree_opt = 0.9333     Value = 1.0000 
elapsed = 0.06      Round = 15      eta_opt = 0.8579        max_depth_opt = 6.0000  nrounds_opt = 155.7652  subsample_opt = 0.9330  bytree_opt = 0.6380     Value = 1.0000 
elapsed = 0.05      Round = 16      eta_opt = 0.7991        max_depth_opt = 6.0000  nrounds_opt = 159.1933  subsample_opt = 0.9602  bytree_opt = 0.7328     Value = 1.0000 
elapsed = 0.04      Round = 17      eta_opt = 0.2204        max_depth_opt = 5.0000  nrounds_opt = 112.8439  subsample_opt = 0.8948  bytree_opt = 0.4939     Value = 1.0000 
elapsed = 0.04      Round = 18      eta_opt = 0.8253        max_depth_opt = 4.0000  nrounds_opt = 126.4373  subsample_opt = 0.6642  bytree_opt = 0.4461     Value = 1.0000 
elapsed = 0.05      Round = 19      eta_opt = 0.9071        max_depth_opt = 5.0000  nrounds_opt = 129.1942  subsample_opt = 0.6238  bytree_opt = 0.6919     Value = 1.0000 
elapsed = 0.03      Round = 20      eta_opt = 0.9343        max_depth_opt = 4.0000  nrounds_opt = 86.8685   subsample_opt = 0.9110  bytree_opt = 0.5663     Value = 1.0000 
── 1. Error: (unknown) (@test-xgb_opt.R#9)  ────────────────────────────────────
task 1 failed - "non-finite value supplied by optim"
1: xgb_opt(train_data = tr, train_label = y, test_data = ts, test_label = y, objectfun = "multi:softmax", 
       evalmetric = "merror", classes = 10, init_points = 20, n_iter = 1) at testthat/test-xgb_opt.R:9
2: BayesianOptimization(xgb_holdout, bounds = list(eta_opt = eta_range, max_depth_opt = max_depth_range, 
       nrounds_opt = nrounds_range, subsample_opt = subsample_range, bytree_opt = bytree_range), 
       init_points, init_grid_dt = NULL, n_iter, acq, kappa, eps, optkernel, verbose = TRUE)
3: Utility_Max(DT_bounds, GP, acq = acq, y_max = max(DT_history[, Value]), kappa = kappa, 
       eps = eps) %>% Min_Max_Inverse_Scale_Vec(., lower = DT_bounds[, Lower], upper = DT_bounds[, 
       Upper]) %>% magrittr::set_names(., DT_bounds[, Parameter]) %>% inset(., DT_bounds[Type == 
       "integer", Parameter], round(extract(., DT_bounds[Type == "integer", Parameter])))
4: eval(lhs, parent, parent)
5: eval(lhs, parent, parent)
6: Utility_Max(DT_bounds, GP, acq = acq, y_max = max(DT_history[, Value]), kappa = kappa, 
       eps = eps)
7: foreach(i = 1:nrow(Mat_tries), .combine = "rbind") %do% {
       optim_result <- optim(par = Mat_tries[i, ], fn = Utility, GP = GP, acq = acq, 
           y_max = y_max, kappa = kappa, eps = eps, method = "L-BFGS-B", lower = rep(0, 
               length(DT_bounds[, Lower])), upper = rep(1, length(DT_bounds[, Upper])), 
           control = list(maxit = 100, factr = 5e+11))
       c(optim_result$par, optim_result$value)
   } %>% data.table(.) %>% setnames(., old = names(.), new = c(DT_bounds[, Parameter], 
       "Negetive_Utility"))
8: eval(lhs, parent, parent)
9: eval(lhs, parent, parent)
10: foreach(i = 1:nrow(Mat_tries), .combine = "rbind") %do% {
       optim_result <- optim(par = Mat_tries[i, ], fn = Utility, GP = GP, acq = acq, 
           y_max = y_max, kappa = kappa, eps = eps, method = "L-BFGS-B", lower = rep(0, 
               length(DT_bounds[, Lower])), upper = rep(1, length(DT_bounds[, Upper])), 
           control = list(maxit = 100, factr = 5e+11))
       c(optim_result$par, optim_result$value)
   }
11: e$fun(obj, substitute(ex), parent.frame(), e$data)

══ testthat results  ═══════════════════════════════════════════════════════════
OK: 0 SKIPPED: 0 FAILED: 1
1. Error: (unknown) (@test-xgb_opt.R#9) 

Error: testthat unit tests failed
Execution halted
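
For reference, the failing call from tests/testthat/test-xgb_opt.R (line 9) is restated below as a standalone sketch, taken from the traceback above; tr, ts, and y are fixtures defined earlier in that test file and are not reproduced here:

    # The xgb_opt() call that fails under xgboost 0.81.0.1 with
    # "non-finite value supplied by optim" (copied from the traceback above).
    # tr, ts, and y come from the test's own setup code.
    res <- xgb_opt(train_data = tr, train_label = y,
                   test_data = ts, test_label = y,
                   objectfun = "multi:softmax",
                   evalmetric = "merror",
                   classes = 10,
                   init_points = 20, n_iter = 1)
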
ymattu commented 5 years ago

@hetong007 Thank you for your message about the error in the MlBayesOpt package test code. I'll check the problem and fix it as soon as possible.

hetong007 commented 5 years ago

Brilliant!

Note that we haven't submitted it to CRAN yet. To build xgboost 0.81.0.1, please:

  1. Check out https://github.com/dmlc/xgboost/tree/release_0.81
  2. Execute make Rbuild
  3. Install the tarball (see the sketch below).
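
A minimal sketch of step 3, assuming the tarball produced by make Rbuild sits in the repository root and is named xgboost_0.81.0.1.tar.gz (the exact path and file name may differ on your machine):

    # From an R session started in the xgboost repository root:
    install.packages("xgboost_0.81.0.1.tar.gz", repos = NULL, type = "source")
    packageVersion("xgboost")  # should now report the new version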

Thanks.

hetong007 commented 5 years ago

Hi @ymattu, we are planning to release this version to CRAN soon, which will break your package on CRAN unless it is updated.

ymattu commented 5 years ago

@hetong007 Thank you for your message. Sorry, I've been very busy since last month. I'm going to work on the package next week.

ymattu commented 5 years ago

I will maintain this package this weekend.