-
Goal: Compare random forests with a simple multilayer perceptron in a small benchmark experiment.
1. We need to define the tasks:
Use three small tasks from OpenML: https://mlr3book.ml…
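The workflow above refers to mlr3 (R); as a rough, language-agnostic stand-in, the sketch below sets up the same kind of benchmark in Python with scikit-learn and OpenML. The dataset names and model settings are placeholders, not the tasks from the truncated link.

```python
# Minimal benchmark sketch (assumed scikit-learn stand-in for the mlr3 workflow;
# the OpenML dataset names below are placeholders, not the three linked tasks).
from sklearn.datasets import fetch_openml
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

datasets = ["iris", "wine", "diabetes"]  # placeholder small OpenML tasks
learners = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "mlp": make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
    ),
}

for name in datasets:
    X, y = fetch_openml(name=name, version=1, return_X_y=True, as_frame=False)
    for learner_name, learner in learners.items():
        scores = cross_val_score(learner, X, y, cv=5)  # 5-fold CV accuracy
        print(f"{name:>10} | {learner_name:>13} | {scores.mean():.3f}")
```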
-
Hello Hydra Team,
I am exploring the possibility of integrating the Optuna Sweeper for hyperparameter tuning in a multi-process setup using GridSearch. My objective is to utilize multiple GPUs on …
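Below is a minimal sketch of the underlying Optuna mechanics rather than the Hydra sweeper configuration itself: several worker processes share one study through an RDB storage and draw grid points from a GridSampler, with each process pinned to one GPU via CUDA_VISIBLE_DEVICES. The search space, objective, and paths are assumptions for illustration.

```python
# Rough sketch of the Optuna side of a multi-process grid sweep: workers share
# one study via SQLite storage and a GridSampler hands out grid points.
import optuna

search_space = {"lr": [1e-4, 1e-3, 1e-2], "batch_size": [32, 64, 128]}

def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_categorical("lr", search_space["lr"])
    batch_size = trial.suggest_categorical("batch_size", search_space["batch_size"])
    # Placeholder: run the real training job on the GPU selected below and
    # return its validation loss instead of this dummy value.
    return lr * batch_size

if __name__ == "__main__":
    # Launch this script once per GPU, e.g.:
    #   CUDA_VISIBLE_DEVICES=0 python sweep.py &
    #   CUDA_VISIBLE_DEVICES=1 python sweep.py &
    study = optuna.create_study(
        study_name="grid_sweep",
        storage="sqlite:///grid_sweep.db",  # shared storage so workers cooperate
        sampler=optuna.samplers.GridSampler(search_space),
        direction="minimize",
        load_if_exists=True,
    )
    study.optimize(objective, n_trials=9)  # 3 x 3 grid points in total
```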
-
I need to try out different hyperparameters and compare their performance. I would be interested to know if an automated hyperparameter tuning option is available to do Bayesian optimization as an e…
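One commonly available option (an assumption here, not necessarily something the original project exposes) is scikit-optimize's BayesSearchCV, which can replace a grid search almost drop-in; the estimator and search space below are illustrative only.

```python
# Assumed sketch: Bayesian optimization of a model via scikit-optimize's
# BayesSearchCV, used like GridSearchCV but sampling points adaptively.
from skopt import BayesSearchCV
from skopt.space import Real, Integer
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)

opt = BayesSearchCV(
    RandomForestClassifier(random_state=0),
    {
        "n_estimators": Integer(50, 500),
        "max_depth": Integer(2, 20),
        "max_features": Real(0.1, 1.0),
    },
    n_iter=25,  # number of Bayesian-optimization evaluations
    cv=3,
    random_state=0,
)
opt.fit(X, y)
print(opt.best_params_, opt.best_score_)
```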
-
It would be great to enable running --optimize with the --trained-agent flag!
In my case, I pre-trained an agent on a simplified task and want to continue training it on the real task (wh…
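As a hedged sketch of what that workflow could look like done by hand with stable-baselines3 and Optuna (outside the zoo's --optimize/--trained-agent flags), each trial below loads the pre-trained agent, overrides one hyperparameter, continues training on the real task, and reports mean reward. The checkpoint path, environment id, and budgets are placeholders.

```python
# Assumed manual version of "optimize while continuing from a pre-trained agent":
# every trial reloads the checkpoint, overrides the learning rate, trains further
# on the target task, and is scored by mean evaluation reward.
import gymnasium as gym
import optuna
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

PRETRAINED_PATH = "pretrained_simplified_task.zip"  # placeholder checkpoint
ENV_ID = "CartPole-v1"                               # placeholder "real" task

def objective(trial: optuna.Trial) -> float:
    env = gym.make(ENV_ID)
    lr = trial.suggest_float("learning_rate", 1e-5, 1e-3, log=True)
    # load() accepts keyword overrides, so the agent keeps its weights but
    # continues training with the trial's learning rate
    model = PPO.load(PRETRAINED_PATH, env=env, learning_rate=lr)
    model.learn(total_timesteps=50_000)
    mean_reward, _ = evaluate_policy(model, env, n_eval_episodes=10)
    return mean_reward

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
```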
-
# Hyperparameters Tuning for XGBoost using Bayesian Optimization | Dr.Data.King
How to tune your XGBoost model hyperparameters? How to set up parallel computing for your model training, which may take…
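The snippet below is not the article's code, just a minimal, assumed illustration of the general recipe: Bayesian optimization over a few XGBoost hyperparameters with the bayesian-optimization package, scored by cross-validation and parallelized across cores via n_jobs.

```python
# Assumed sketch of Bayesian optimization for XGBoost: the optimizer proposes
# hyperparameters, and cross-validated accuracy is the objective to maximize.
import xgboost as xgb
from bayes_opt import BayesianOptimization
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def xgb_cv(max_depth, learning_rate, subsample):
    model = xgb.XGBClassifier(
        n_estimators=300,
        max_depth=int(max_depth),  # the optimizer proposes floats
        learning_rate=learning_rate,
        subsample=subsample,
        n_jobs=-1,                 # parallelize tree building across cores
        tree_method="hist",
    )
    return cross_val_score(model, X, y, cv=3).mean()

optimizer = BayesianOptimization(
    f=xgb_cv,
    pbounds={"max_depth": (2, 10), "learning_rate": (0.01, 0.3), "subsample": (0.5, 1.0)},
    random_state=0,
)
optimizer.maximize(init_points=5, n_iter=25)
print(optimizer.max)
```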
-
Hello,
I'm working with a very large dataset consisting of 7.5 million rows and 18 columns, which represents customer purchase behavior. I initially used UMAP for dimensionality reduction and attem…
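One way to keep UMAP tractable at that scale (an assumed sketch, not the original pipeline) is to fit the embedding on a random subsample and then map the remaining rows with transform(); the array below is a small stand-in for the 7.5M x 18 matrix.

```python
# Assumed sketch: learn the UMAP embedding on a subsample, then project the
# full dataset into the learned space instead of fitting on every row.
import numpy as np
import umap

rng = np.random.default_rng(0)
X = rng.normal(size=(50_000, 18))   # stand-in for the 7.5M x 18 matrix

sample_idx = rng.choice(len(X), size=10_000, replace=False)
reducer = umap.UMAP(n_components=2, n_neighbors=15, low_memory=True, random_state=0)
reducer.fit(X[sample_idx])          # fit only on the subsample

embedding = reducer.transform(X)    # project all rows into the learned space
print(embedding.shape)
```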
-
### Is this a unique feature?
- [X] I have checked "open" AND "closed" issues and this is not a duplicate
### Is your feature request related to a problem/unavailable functionality? Please descr…
-
#### Description
Currently, there is no example or demonstration of a hyperparameter optimization function that uses warm_start (GridSearchCV and RandomizedSearchCV ignore warm_start) and tunes multiple …
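A minimal, assumed sketch of the kind of example being requested: tuning n_estimators with warm_start=True so each refit reuses the previously grown trees instead of retraining from scratch, which is what GridSearchCV would do.

```python
# Assumed warm_start tuning example: grow the ensemble incrementally and keep
# the validation score for each size, rather than refitting from zero each time.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(warm_start=True, random_state=0)
scores = {}
for n in range(50, 501, 50):
    clf.set_params(n_estimators=n)  # only the newly added trees are fitted
    clf.fit(X_train, y_train)
    scores[n] = clf.score(X_val, y_val)

best_n = max(scores, key=scores.get)
print(best_n, scores[best_n])
```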
-
I'm trying to launch a hyperparameter optimization using clearml (version 1.0.5). The first thing each individual experiment does is check out the repo using a cached version. It checks out an old commit (a…
-
We can employ [hyperopt](https://github.com/hyperopt/hyperopt) or [nevergrad](https://github.com/facebookresearch/nevergrad) for optimized hyperparameter search. Grid Search should be considered as op…
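For concreteness, a small, assumed sketch of what a hyperopt-based search would look like (TPE over a toy space; the objective is a placeholder for the real training run):

```python
# Assumed hyperopt sketch: TPE-driven search over a small space; replace the
# dummy loss with the real training/validation run.
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

space = {
    "lr": hp.loguniform("lr", -9, -2),       # roughly 1e-4 .. 0.14
    "n_layers": hp.choice("n_layers", [1, 2, 3]),
}

def objective(params):
    # placeholder objective so the sketch runs end to end
    loss = (params["lr"] - 0.01) ** 2 + 0.1 * params["n_layers"]
    return {"loss": loss, "status": STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50, trials=trials)
print(best)
```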