oloBion / Retip

Retip - Retention Time prediction for metabolomics

Need help: Error with fit.xgboost "CPU Time Limit Reached" and Keras munmap_chunk() #2

Closed by yguitton 3 years ago

yguitton commented 3 years ago

Hi,

Can you help me with this error, obtained in RStudio while running the plasma tutorial example?

Train Model

    setSessionTimeLimit(cpu = Inf, elapsed = Inf)
    xgb <- fit.xgboost(training)
    [1] "Computing model Xgboost  ... Please wait ..."
    Error in selectChildren(ac[!fin], -1) : reached CPU time limit

    keras <- fit.keras(training, testing)
    Error in `/usr/lib/rstudio-server/bin/rsession': munmap_chunk(): invalid pointer: 0x00007ffd60046b60

The other models run fine in my configuration.

Many thanks Yann

yguitton commented 3 years ago

Hi,

I managed to create an xgb model, but with some warnings (see below). Any clue for me?

Thanks for your great tool

    xgb <- fit.xgboost(training)
    [1] "Computing model Xgboost  ... Please wait ..."
    [1] "End training"
    Warning messages:
    1: model fit failed for Fold09: eta=0.02, max_depth=4, gamma=1, colsample_bytree=0.5, min_child_weight=10, subsample=0.5, nrounds=1000
       Error in xgboost::xgb.train(list(eta = param$eta, max_depth = param$max_depth,  : reached CPU time limit
    2: In nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, :
       There were missing values in resampled performance measures.
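In case it is useful, this is how I inspected the fitted object afterwards; these are just the standard accessors of a caret train object, nothing Retip-specific:

    # aggregated RMSE / Rsquared / MAE for each tuning combination
    xgb$results

    # the tuning combination that caret finally selected
    xgb$bestTune

    # per-fold metrics for the selected combination
    xgb$resample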

PaoloBnn commented 3 years ago

Hi, Thanks!!

I have never had this problem, and I have not found a solution :-(

But you can try modifying the function to use fewer nrounds and see whether you still hit the CPU limit. Did you use the prepare.wizard() function? Try running with and without it (you need to restart R to undo its parallel setup).
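If you prefer not to restart R, something like this might also work before calling fit.xgboost(). This is only a sketch and assumes prepare.wizard() registers a parallel foreach backend; setTimeLimit() and foreach::registerDoSEQ() are standard base R / foreach calls, not Retip functions:

    # lift R's CPU and elapsed time limits for the current session (base R)
    setTimeLimit(cpu = Inf, elapsed = Inf, transient = FALSE)

    # drop back to sequential execution in case a parallel backend was registered
    foreach::registerDoSEQ()

    xgb <- fit.xgboost(training)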

So you can create a new xgboost function with the code below, where I reduced the maximum nrounds parameter to 500:

    # new xgboost model to work around the CPU time limit
    fit.xgboost2 <- function(x) {

      # set up train control for 10-fold cross-validation
      cv.ctrl <- caret::trainControl(method = "cv", number = 10)

      # tune grid parameters (nrounds capped at 500)
      xgb.grid <- base::expand.grid(nrounds = c(300, 400, 500),
                                    max_depth = c(2, 3, 4, 5),
                                    eta = c(0.01, 0.02),
                                    gamma = c(1),
                                    colsample_bytree = c(0.5),
                                    subsample = c(0.5),
                                    min_child_weight = c(10))

      print("Computing model Xgboost  ... Please wait ...")

      # model training using the above parameters
      set.seed(101)
      model_xgb <- caret::train(RT ~ .,
                                data = x,
                                method = "xgbTree",
                                metric = "RMSE",
                                trControl = cv.ctrl,
                                tuneGrid = xgb.grid)

      print("End training")

      return(model_xgb)
    }
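Then you can call it on your training set just like the original function:

    xgb <- fit.xgboost2(training)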

Hope this can help.

Regarding Keras, you have an installation problem. This can be the solution: https://tensorflow.rstudio.com/reference/keras/install_keras/
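A minimal sketch of the reinstall from that page, assuming you use the keras R package (the exact options depend on your Python/conda setup):

    # reinstall the R package and the Python backend it needs
    install.packages("keras")
    library(keras)
    install_keras()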

Thanks! Best Paolo

yguitton commented 3 years ago

Hi Paolo,

Thanks for your answer, I will give it a try.

Best Yann