-
# 🚀 Feature Request
- It would be awesome if the Optuna **study object** created [here](https://github.com/facebookresearch/hydra/blob/809718cdcd64f9cd930d26dea69f2660a6ffa833/plugins/hydra_optuna_…
-
## Motivation
The optuna-integration Comet callback was introduced in https://github.com/optuna/optuna-integration/pull/63#issuecomment-2187576883. It would be great to have the example in this rep…
-
We are starting to accumulate some parameters in hapestry, which could benefit from parameter tuning (as with Optuna).
To start with, these seem like good candidates (in order of priority):
- **d…
-
### What is an issue?
The CatBoost callback does not work with GPUs, as noted in https://github.com/optuna/optuna/pull/3903. However, the latest stable CatBoost release supports the callback on GPUs ([ref](…
-
### Motivation
Optuna drops Python 3.6 support from v3.1, so we can use `from __future__ import annotations`, which simplifies the code base. See [PEP 563](https://peps.python.org/pep-0563/), [PEP 584](https://peps.pyt…
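As an illustration of the simplification, the future import postpones annotation evaluation, so PEP 585 builtin generics and PEP 604 unions can be written directly even on Python 3.7/3.8. The `Trial` class below is a simplified stand-in for this sketch, not Optuna's actual class:

```python
from __future__ import annotations  # PEP 563: postpone annotation evaluation


class Trial:
    # Without the future import, Python 3.7/3.8 (the versions left after
    # dropping 3.6) would reject both the builtin generic `list[Trial]`
    # and the union `Trial | None`, forcing typing.List["Trial"] and
    # typing.Optional["Trial"] with quoted forward references instead.
    def children(self) -> list[Trial]:
        return []

    def best(self) -> Trial | None:
        return None
```

Under PEP 563 the annotations are stored as strings, so they are never evaluated at class-definition time and the forward self-reference is free.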
-
When I try to run a search with Optuna on 8 GPUs using the DDP training strategy,
the sweeper starts 8 groups of different hyperparameters, so the parameter shapes don't match across the GPUs.
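Optuna's integration layer handles this situation by sampling on a single rank and broadcasting the result (see `optuna.integration.TorchDistributedTrial`). A pure-Python sketch of the idea, where `broadcast` is a stand-in for a real collective op such as `torch.distributed.broadcast` and all names are illustrative:

```python
import random

WORLD_SIZE = 8  # one process per GPU


def sample_params(seed: int) -> dict:
    # A process that samples independently gets its own values; with 8
    # DDP ranks each doing this, the hyperparameters (and hence the
    # model shapes) disagree across GPUs, as described above.
    rng = random.Random(seed)
    return {"lr": rng.uniform(1e-5, 1e-1), "hidden": rng.choice([128, 256, 512])}


# Naive: every rank samples on its own -> 8 different configs.
naive = [sample_params(seed=rank) for rank in range(WORLD_SIZE)]


def broadcast(value_from_rank0):
    # Stand-in for a real collective op (e.g. torch.distributed.broadcast).
    return value_from_rank0


# Fix: only rank 0 samples; every other rank receives the same config,
# so all processes run one trial with matching parameter shapes.
rank0_params = sample_params(seed=0)
synced = [broadcast(rank0_params) for _rank in range(WORLD_SIZE)]
```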
-
### Motivation
In `optuna/optuna/visualization`:
- The functions `_is_log_scale()`, `_is_categorical()`, and `_is_numerical()`, which iterate over the trials, are called several times redund…
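One way to remove the redundancy would be to compute these per-parameter properties in a single pass and reuse the result across the plotting code. A minimal sketch under assumed names (this is not Optuna's actual implementation, and the log-scale threshold is an illustrative heuristic):

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass(frozen=True)
class AxisInfo:
    """Per-parameter properties computed once, instead of per helper call."""
    is_categorical: bool
    is_log_scale: bool


def analyze_axes(trials: list[dict]) -> dict[str, AxisInfo]:
    # A single pass over the trials replaces the repeated iteration done
    # by _is_categorical() / _is_log_scale() / _is_numerical().
    values_by_param: dict[str, list] = {}
    for trial in trials:
        for name, value in trial["params"].items():
            values_by_param.setdefault(name, []).append(value)

    info: dict[str, AxisInfo] = {}
    for name, values in values_by_param.items():
        numerical = all(isinstance(v, (int, float)) for v in values)
        if numerical and min(values) > 0:
            log_scale = max(values) / min(values) > 1e3  # heuristic threshold
        else:
            log_scale = False
        info[name] = AxisInfo(is_categorical=not numerical, is_log_scale=log_scale)
    return info


trials = [
    {"params": {"lr": 1e-4, "optimizer": "adam"}},
    {"params": {"lr": 0.5, "optimizer": "sgd"}},
]
axes = analyze_axes(trials)
```

Each plotting function would then take the precomputed `axes` mapping instead of re-scanning the trials itself.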
-
![Parallel_Coordinate_lgbm_clf_shap_optuna](https://user-images.githubusercontent.com/69301816/158623162-af8355db-545a-4b1f-97cc-b6529c2b6011.png)
-
**Description**
The `runs` method of the wandb API fails to return any runs when the key for a filtered config value contains a `.`.
**How to reproduce**
This should return [optuna-bc57c1-0-0…
-
### Feature request
Adding generation configurations to the parameters that can be tuned in a `Trainer`.
### Motivation
When defining the Optuna hyper-parameter space, I would like to invest…