-
We can employ [hyperopt](https://github.com/hyperopt/hyperopt) or [nevergrad](https://github.com/facebookresearch/nevergrad) for optimized hyperparameter search. Grid Search should be considered as op…
-
**Bug Report Checklist**
- [x] I provided code that demonstrates a minimal reproducible example.
- [?] I confirmed the bug exists on the latest mainline of AutoGluon via a source install.
- [x] I c…
-
Investigate whether using something like [optuna](https://optuna.org) is better than plain ol' grid search when it comes to the non-deep-learning algorithms that are in SKLL/scikit-learn.
-
Hi there!
Hyperband has shown the ability to do state-of-the-art hyperparameter optimization. I implemented it with skopt dimensions.
Here is a gist if you are interested in incorporating the al…
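The gist itself is not reproduced here, but the successive-halving step at the core of Hyperband can be sketched in plain Python (the toy loss, elimination factor, and budget schedule below are illustrative; the implementation mentioned above uses skopt dimensions instead):

```python
# Pure-Python sketch of successive halving, the core step of Hyperband.
# The toy loss and all constants are illustrative assumptions.
import random

def toy_loss(config, budget):
    # Pretend loss: configs closer to 0.5 are better, and a larger budget
    # (e.g. more training epochs) reduces the noise in the estimate.
    return abs(config - 0.5) + random.uniform(0, 1.0 / budget)

def successive_halving(configs, min_budget=1, eta=3):
    budget = min_budget
    while len(configs) > 1:
        # Evaluate every surviving config at the current budget...
        scored = sorted(configs, key=lambda c: toy_loss(c, budget))
        # ...keep the best 1/eta fraction, and grow the budget by eta.
        configs = scored[: max(1, len(scored) // eta)]
        budget *= eta
    return configs[0]

random.seed(0)
candidates = [random.random() for _ in range(27)]
best = successive_halving(candidates)
print(best)
```

Hyperband proper runs several such brackets with different trade-offs between the number of starting configs and the initial budget, hedging against losses that are misleading at low budgets.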
-
It would be great to enable running --optimize together with the --trained-agent flag!
In my case, I pre-trained an agent on a simplified task and want to continue training it on the real task (wh…
-
-
Using hyperopt 0.2.7 on Windows 10 with Python 3.9.15:
```
  File "c:\users\pfa2ba\documents\untitled0.py", line 46, in
    best = fmin(train_func, space, **fmin_params)
  File "C:\Users\pfa2ba\.…
```
-
Is there a way to implement the Bayesian Optimization method of hyperparameter tuning in the (R-based) PLP pipeline?
-
The Scikit-optimize `BayesSearchCV` class likely has internal functions that help it determine new parameters given scores on a set of older parameters. It would be useful to either introduce Dask to…
-
Since most people here have smaller datasets and struggle with trying out different hyperparameters, I came across the Ray Tune library for hyperparameter tuning. Has anyone had any experience…