-
Optimization of the noise parameters (`sigma_n`, `sigma_p`) and of the number of basis functions (`order`), similar to `optimize` for Gaussian processes.
A good summary: http://krasserm.github.io/2019/02/23/bayesi…
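A minimal sketch of what such an `optimize` could look like, assuming a Bayesian linear model with Gaussian basis functions along the lines of the linked post: the continuous hyperparameters (`sigma_n`, `sigma_p`) are fit by maximizing the log marginal likelihood, while the integer `order` is handled by an outer grid search. `design_matrix` and the basis choice are illustrative assumptions, not the package's API.

```python
import numpy as np
from scipy.optimize import minimize

def design_matrix(x, order):
    # Gaussian basis functions spread over the input range (illustrative choice)
    centers = np.linspace(x.min(), x.max(), order)
    return np.exp(-0.5 * (x[:, None] - centers[None, :]) ** 2)

def neg_log_marginal(log_params, Phi, y):
    # With w ~ N(0, sigma_p^2 I) and noise N(0, sigma_n^2 I), the marginal is
    # y ~ N(0, K), K = sigma_p^2 Phi Phi^T + sigma_n^2 I (constant term dropped)
    sigma_n, sigma_p = np.exp(log_params)  # optimize in log space for positivity
    K = sigma_p**2 * Phi @ Phi.T + sigma_n**2 * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum()

def optimize(x, y, orders=range(2, 15)):
    best = None
    for order in orders:  # integer hyperparameter: plain grid search
        Phi = design_matrix(x, order)
        res = minimize(neg_log_marginal, x0=np.log([0.1, 1.0]), args=(Phi, y))
        if best is None or res.fun < best[0]:
            best = (res.fun, order, np.exp(res.x))
    return best  # (neg. log marginal likelihood, order, (sigma_n, sigma_p))
```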
-
I've observed that for models that take a large number of steps to reach the early stopping criterion (~20k+ steps), increasing the learning rate significantly (5e-5 --> 2e-4) often cuts the numb…
-
Setting a strong beta-prior that encourages short lengthscales (e.g., `Gamma(0.25, 0.5)`) can lead to numerical issues during hyperparameter optimization. In particular, `optimize_restarts` samples hy…
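For reference, a sketch of the setup being described, assuming GPy (where `optimize_restarts` lives); restarts resample parameters from any attached prior, so a prior concentrated near zero lengthscale can start the optimizer in numerically fragile territory:

```python
import numpy as np
import GPy

X = np.random.rand(50, 1)
y = np.sin(6 * X) + 0.1 * np.random.randn(50, 1)

kern = GPy.kern.RBF(input_dim=1)
# strong prior favoring short lengthscales, as described above
kern.lengthscale.set_prior(GPy.priors.Gamma(0.25, 0.5))

m = GPy.models.GPRegression(X, y, kern)
# each restart randomizes parameters by sampling from the attached priors;
# near-zero lengthscale draws can make the kernel matrix ill-conditioned
m.optimize_restarts(num_restarts=10)
```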
-
# Hyperparameters Tuning for XGBoost using Bayesian Optimization | Dr.Data.King
How to tune your XGBoost model hyperparameters? How to set up parallel computing for your model training which may take…
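For anyone landing here without access to the full post, a minimal sketch of the idea using the `bayesian-optimization` package with scikit-learn cross-validation; the bounds and parameters below are placeholders, not the article's:

```python
from bayes_opt import BayesianOptimization
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, random_state=0)

def xgb_cv(max_depth, learning_rate, subsample):
    model = XGBClassifier(
        max_depth=int(max_depth),  # cast: the optimizer proposes floats
        learning_rate=learning_rate,
        subsample=subsample,
        n_estimators=200,
        n_jobs=-1,                 # parallel tree construction
    )
    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()

optimizer = BayesianOptimization(
    f=xgb_cv,
    pbounds={"max_depth": (3, 10),
             "learning_rate": (0.01, 0.3),
             "subsample": (0.5, 1.0)},
    random_state=0,
)
optimizer.maximize(init_points=5, n_iter=20)
print(optimizer.max)  # best CV score and hyperparameters found
```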
-
Hello, can hyperparameter optimization be done for multi-objective problems?
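The framework isn't named here, but as one illustration of what multi-objective hyperparameter optimization looks like, Optuna accepts several optimization directions and returns the Pareto-optimal trials (the two objectives below are toy placeholders):

```python
import optuna

def objective(trial):
    x = trial.suggest_float("x", 0.0, 2.0)
    y = trial.suggest_float("y", 0.0, 2.0)
    accuracy_proxy = (x - 1.0) ** 2 + y  # two competing objectives
    cost_proxy = x + (y - 1.0) ** 2
    return accuracy_proxy, cost_proxy

study = optuna.create_study(directions=["minimize", "minimize"])
study.optimize(objective, n_trials=50)
print(len(study.best_trials))  # Pareto-optimal trials, not a single optimum
```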
-
**What are you trying to do?**
I was able to run my first training/prediction by following the tutorials in a notebook! I had a couple of questions on how to achieve the following:
1. How to use `rdkit2d`…
-
**Is your feature request related to a problem? Please describe.**
For both parameter optimization and the calculation itself, it would be great to have functions that analytically calculate deriv…
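Until analytic derivatives exist, one stopgap is automatic differentiation, which gives exact (not finite-difference) gradients whenever the calculation can be written as a pure function of its parameters; a sketch with JAX, where `objective` is a hypothetical stand-in:

```python
import jax
import jax.numpy as jnp

def objective(params):
    # hypothetical stand-in for the calculation in question
    return jnp.sum(jnp.sin(params) ** 2)

grad_fn = jax.grad(objective)  # exact derivative, machine-precision accurate
print(grad_fn(jnp.array([0.1, 0.5, 1.0])))
```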
-
Any idea how I can do cross-validation (e.g., KFold) with a KAN model?
Thanks
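A sketch of wrapping a KAN with scikit-learn's KFold: split indices, then build and train a fresh model per fold. The pykan-specific calls (`KAN(width=...)`, `model.fit(dataset, ...)`, the dataset dict keys) follow the pykan README and may differ between versions (older releases use `model.train(...)`):

```python
import numpy as np
import torch
from sklearn.model_selection import KFold
from kan import KAN

X = torch.rand(500, 2)
y = X[:, [0]] ** 2 + torch.sin(X[:, [1]])

scores = []
kf = KFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in kf.split(np.arange(len(X))):
    dataset = {
        "train_input": X[train_idx], "train_label": y[train_idx],
        "test_input":  X[test_idx],  "test_label":  y[test_idx],
    }
    model = KAN(width=[2, 5, 1], grid=5, k=3)  # fresh model per fold
    model.fit(dataset, opt="LBFGS", steps=20)  # model.train(...) on older pykan
    pred = model(dataset["test_input"])
    scores.append(torch.mean((pred - dataset["test_label"]) ** 2).item())

print(f"mean CV MSE: {np.mean(scores):.4f}")
```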
-
### Motivation for the feature
Models take a long time to train, so integrating [sweeps](https://docs.wandb.ai/guides/sweeps) for model hyperparameter tuning would help us converge towards the …
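As a sketch of what the integration could look like with the wandb sweeps API (the metric name, hyperparameters, and `train` body below are placeholders):

```python
import wandb

sweep_config = {
    "method": "bayes",
    "metric": {"name": "val_loss", "goal": "minimize"},
    "parameters": {
        "learning_rate": {"min": 1e-5, "max": 1e-2},
        "batch_size": {"values": [16, 32, 64]},
    },
}

def train():
    run = wandb.init()
    cfg = run.config
    # ... build and train the model using cfg.learning_rate, cfg.batch_size ...
    run.log({"val_loss": 0.0})  # placeholder for the real validation metric

sweep_id = wandb.sweep(sweep_config, project="my-project")
wandb.agent(sweep_id, function=train, count=20)  # run 20 trials of the sweep
```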
-
Labeled LDA does not support alpha and beta hyperparameter optimization at this time.
How can it be implemented?
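In standard LDA, one common route is Minka's fixed-point iteration on the counts collected by the collapsed Gibbs sampler, and the same update could plausibly be adapted to Labeled LDA. A sketch for `alpha`, assuming `n_dk` is the D×K matrix of per-document topic counts (the analogous update on topic-word counts gives `beta`):

```python
import numpy as np
from scipy.special import digamma

def update_alpha(alpha, n_dk, n_iter=50, tol=1e-6):
    """Minka's fixed-point update for an asymmetric Dirichlet alpha (shape K)."""
    n_d = n_dk.sum(axis=1)  # document lengths
    D = len(n_d)
    for _ in range(n_iter):
        alpha_sum = alpha.sum()
        # alpha_k <- alpha_k * sum_d [psi(n_dk + alpha_k) - psi(alpha_k)]
        #                    / sum_d [psi(n_d + alpha_sum) - psi(alpha_sum)]
        num = digamma(n_dk + alpha).sum(axis=0) - D * digamma(alpha)
        den = digamma(n_d + alpha_sum).sum() - D * digamma(alpha_sum)
        new_alpha = alpha * num / den
        if np.max(np.abs(new_alpha - alpha)) < tol:
            return new_alpha
        alpha = new_alpha
    return alpha
```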