-
Hello Hydra Team,
I am exploring the possibility of integrating the Optuna Sweeper for hyperparameter tuning in a multi-process setup using GridSearch. My objective is to utilize multiple GPUs on …
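For context, a minimal sketch of the underlying Optuna pattern such a setup builds on (plain Optuna rather than the Hydra sweeper itself): a `GridSampler` shared across worker processes through RDB storage, with each process pinned to one GPU. Names like `run_worker` and `n_gpus` are illustrative.

```python
import multiprocessing as mp

import optuna

SEARCH_SPACE = {"lr": [1e-4, 1e-3, 1e-2, 1e-1], "batch_size": [32, 64, 128]}


def objective(trial: optuna.Trial, gpu_id: int) -> float:
    lr = trial.suggest_float("lr", 1e-4, 1e-1)      # values are drawn from the grid
    bs = trial.suggest_int("batch_size", 32, 128)
    # ... train on f"cuda:{gpu_id}" here and return a validation metric ...
    return (lr - 1e-2) ** 2 + bs / 1000.0           # dummy metric for the sketch


def run_worker(gpu_id: int) -> None:
    study = optuna.create_study(
        study_name="grid-multi-gpu",
        storage="sqlite:///grid.db",                # shared storage coordinates workers
        sampler=optuna.samplers.GridSampler(SEARCH_SPACE),
        load_if_exists=True,
    )
    study.optimize(lambda t: objective(t, gpu_id), n_trials=3)


if __name__ == "__main__":
    n_gpus = 4
    workers = [mp.Process(target=run_worker, args=(g,)) for g in range(n_gpus)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```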
-
### What is an issue?
## Problem
I cannot build the documentation on my MacBook, and neither can my colleagues. It might be a problem with the dependencies, but I have no clear idea how to fix it. …
-
### What happened + What you expected to happen
I was attempting to follow the example for automatic hyperparameter tuning described here:
https://nixtlaverse.nixtla.io/neuralforecast/examples/autom…
-
`create_study` accepts a `pruner` argument, as in [here](https://github.com/optuna/optuna-examples/blob/29720558a2bac9e0bead3a237640f4013f4ee2bd/pytorch/pytorch_lightning_simple.py#L161)
prune opti…
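For reference, a minimal sketch of passing a pruner to `create_study` and reporting intermediate values so it can act; the training loop is a stand-in:

```python
import optuna


def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    val_loss = 1.0
    for epoch in range(20):
        val_loss *= 0.9 + lr                  # stand-in for one epoch of training
        trial.report(val_loss, step=epoch)    # intermediate value the pruner sees
        if trial.should_prune():              # pruner's verdict on this trial
            raise optuna.TrialPruned()
    return val_loss


study = optuna.create_study(
    direction="minimize",
    pruner=optuna.pruners.MedianPruner(n_warmup_steps=5),
)
study.optimize(objective, n_trials=50)
```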
-
@EEG-PK/model
Hyperparameter selection, general optimization. (Optuna?)
-
### Expected behavior
It seems that the `TPESampler` can get stuck and repeatedly sample parameters that result in `NaN`/`None`, without the ability to move away from these parameters. I cannot reproduc…
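For context, a toy objective of the kind that could trigger this, where one region of the search space returns `NaN` and those trials are recorded as failed (illustrative only, not a confirmed reproduction):

```python
import math

import optuna


def objective(trial: optuna.Trial) -> float:
    x = trial.suggest_float("x", -10.0, 10.0)
    if x < 0:                    # a region of the space that blows up
        return float("nan")      # Optuna marks such trials as failed
    return math.sqrt(x)


study = optuna.create_study(sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=100)   # failed trials do not stop the study
failed = [t for t in study.trials if t.state == optuna.trial.TrialState.FAIL]
print(study.best_value, len(failed))
```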
-
- PyTorch-Forecasting version: 0.8.4
- PyTorch version: 1.8.0
- Python version: 3.8.8
- Operating System: CentOS
### Expected behavior
I'm working through the _Demand forecasting with the T…
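For reference, the tutorial's tuning step is built around `optimize_hyperparameters` from pytorch-forecasting; a sketch along these lines, with illustrative ranges (argument names may differ between versions, and the dataloaders are assumed to come from the tutorial's `TimeSeriesDataSet`):

```python
from pytorch_forecasting.models.temporal_fusion_transformer.tuning import (
    optimize_hyperparameters,
)

# train_dataloader / val_dataloader are assumed to come from the tutorial's
# TimeSeriesDataSet; the ranges below are illustrative.
study = optimize_hyperparameters(
    train_dataloader,
    val_dataloader,
    model_path="optuna_test",            # where trial checkpoints are written
    n_trials=100,
    max_epochs=20,
    gradient_clip_val_range=(0.01, 1.0),
    hidden_size_range=(8, 128),
    learning_rate_range=(1e-3, 1e-1),
    use_learning_rate_finder=False,
)
print(study.best_trial.params)
```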
-
### Motivation
I am attempting to use Optuna for hyperparameter optimization of a complex, Lightning-based deep learning framework. It is essential for this framework to run in a distributed setting. …
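For context, a sketch of the standard pattern for running a single Optuna trial across all ranks with `TorchDistributedTrial` (in recent Optuna releases this lives in the separate `optuna-integration` package): rank 0 drives the study and the other ranks replay each trial.

```python
import optuna
import torch.distributed as dist
from optuna.integration import TorchDistributedTrial


def objective(single_trial) -> float:
    # Non-zero ranks pass None; the wrapper broadcasts suggestions from rank 0.
    trial = TorchDistributedTrial(single_trial)
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    # ... run DDP training with this lr and compute a validation metric ...
    return lr                                  # stand-in for the real metric


dist.init_process_group("gloo")                # assumes launch via torchrun
n_trials = 20
if dist.get_rank() == 0:
    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=n_trials)
else:
    for _ in range(n_trials):                  # replay each trial on this rank
        try:
            objective(None)
        except optuna.TrialPruned:
            pass
```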
-
### What happened + What you expected to happen
Bug: When using Optuna as the search algorithm in Ray Tune, performance degrades significantly, CPU utilization decreases, and the number of trials …
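For context, a minimal setup of the kind described, assuming a recent Ray version where `OptunaSearch` lives under `ray.tune.search.optuna`; the objective is a stand-in:

```python
from ray import tune
from ray.tune.search.optuna import OptunaSearch


def objective(config):
    # stand-in for real training; a function trainable may return a final result
    return {"loss": (config["lr"] - 0.01) ** 2}


tuner = tune.Tuner(
    objective,
    param_space={"lr": tune.loguniform(1e-5, 1e-1)},
    tune_config=tune.TuneConfig(
        search_alg=OptunaSearch(metric="loss", mode="min"),
        num_samples=100,
    ),
)
results = tuner.fit()
```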