drdataking / webcomments

Web comments by https://utteranc.es/

Hyperparameters Tuning for XGBoost using Bayesian Optimization | Dr.Data.King #4

Open utterances-bot opened 1 year ago

utterances-bot commented 1 year ago

Hyperparameters Tuning for XGBoost using Bayesian Optimization | Dr.Data.King

How to tune your XGBoost model hyperparameters? How to set up parallel computing for your model training which may take hours? This post will help you.

https://www.drdataking.com/post/hyperparameters-tuning-for-xgboost-using-bayesian-optimization/

fupenghzau commented 1 year ago

Thanks for sharing, but I have a question. The xgboost function in R can take a weight parameter, but xgboost.cv cannot, as far as I understand. So, how can I use the Bayesian approach to optimize the hyperparameters if I want to assign weights to the input samples?

drdataking commented 1 year ago

> Thanks for sharing, but I have a question. The xgboost function in R can take a weight parameter, but xgboost.cv cannot, as far as I understand. So, how can I use the Bayesian approach to optimize the hyperparameters if I want to assign weights to the input samples?

I have never tried this myself, so I'm sorry I can't offer much help. Please do share if you gain any experience with this. Thank you.

patogonzalez commented 10 months ago

Thanks for sharing the script. I ran it on my dataset. The AUC for the untuned model was 0.921, but 0.719 for the tuned one. Does that make sense? Do you advise me to use the tuned model for my predictions?

drdataking commented 10 months ago

> Thanks for sharing the script. I ran it on my dataset. The AUC for the untuned model was 0.921, but 0.719 for the tuned one. Does that make sense? Do you advise me to use the tuned model for my predictions?

I'd suggest you use the one with the higher AUC if AUC is your criterion for model performance. Thanks for reading the post. Good luck with your model tuning.