-
### Expected behavior
It seems that the `TPESampler` can get stuck and repeatedly sample parameters that result in `NaN`/`None`, without the ability to move away from these parameters. I cannot reproduc…
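A minimal sketch of the kind of setup where this can happen, assuming a hypothetical objective whose "bad" region returns `NaN` (the objective and threshold below are not from the report). Trials that return `NaN` are recorded as failed, so they carry no objective value for the sampler to learn from:

```python
import math

import optuna


def objective(trial):
    # Hypothetical objective: part of the search space yields NaN,
    # e.g. because the model diverges for large learning rates.
    lr = trial.suggest_float("lr", 1e-5, 1e1, log=True)
    if lr > 1.0:
        return float("nan")  # recorded as a failed trial, not a value
    return math.log10(lr) ** 2


study = optuna.create_study(sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=50)
print(sum(t.state == optuna.trial.TrialState.FAIL for t in study.trials), "failed trials")
```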
-
### What is an issue?
Optuna has various pruners, and their handling of `NaN` in intermediate values differs from pruner to pruner. Checking the source code and documenting each pruner's handling would be benefi…
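As a starting point, here is a rough sketch (with a made-up training curve) that reports `NaN` intermediate values to two different pruners and counts how many trials each one prunes; whether a given pruner treats a `NaN` report as the worst possible value or simply ignores it is exactly the behaviour this issue asks to document:

```python
import optuna


def objective(trial):
    x = trial.suggest_float("x", 0.0, 1.0)
    value = 1.0 - x
    for step in range(10):
        # Hypothetical training curve that blows up to NaN in part of the
        # search space, mimicking a diverging loss.
        value = float("nan") if x > 0.9 else value * 0.9
        trial.report(value, step)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return value


for pruner in (optuna.pruners.MedianPruner(), optuna.pruners.SuccessiveHalvingPruner()):
    study = optuna.create_study(direction="minimize", pruner=pruner)
    study.optimize(objective, n_trials=30)
    pruned = sum(t.state == optuna.trial.TrialState.PRUNED for t in study.trials)
    print(type(pruner).__name__, "pruned", pruned, "of", len(study.trials), "trials")
```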
-
### What happened + What you expected to happen
Bug: When using Optuna as the search algorithm in Ray Tune, performance degrades significantly, CPU utilization drops, and the number of trials …
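For reference, a minimal sketch of the setup being described: Optuna driving Ray Tune through `OptunaSearch`. The toy trainable and search space are placeholders, and the import path follows recent Ray releases (older versions exposed `OptunaSearch` under `ray.tune.suggest.optuna`):

```python
from ray import tune
from ray.tune.search.optuna import OptunaSearch


def trainable(config):
    # Toy objective standing in for the real training job.
    score = (config["x"] - 2.0) ** 2
    # Returning a dict reports it as the trial's final result; per-step
    # metrics would instead go through the report API of your Ray version.
    return {"loss": score}


tuner = tune.Tuner(
    trainable,
    param_space={"x": tune.uniform(-10.0, 10.0)},
    tune_config=tune.TuneConfig(
        search_alg=OptunaSearch(),
        metric="loss",
        mode="min",
        num_samples=50,
    ),
)
results = tuner.fit()
print(results.get_best_result().config)
```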
-
Hi
I'm trying to run deep learning optimisation using Optuna. It works fine if I have `n_trials=1`; however, if I increase that number to, say, 2, I get an error: `AttributeError: _model_call`. I have enough c…
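For context, a sketch of the usual pattern, with hypothetical `build_model`/`train_and_evaluate` stand-ins for the actual training code. Constructing the model inside the objective, so every trial gets a fresh instance, is worth checking here, since sharing one model object across trials is a common source of attribute errors once `n_trials` is greater than 1:

```python
import optuna


def build_model(learning_rate):
    # Placeholder for the real deep learning model.
    return {"lr": learning_rate}


def train_and_evaluate(model):
    # Placeholder training loop returning a dummy validation loss.
    return (model["lr"] - 0.01) ** 2


def objective(trial):
    # Build the model inside the objective so each trial starts from a
    # fresh instance rather than reusing state from the previous trial.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    model = build_model(lr)
    return train_and_evaluate(model)


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=2)
```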
-
### What is an issue?
## Problem
I cannot build the documentation on my MacBook, and neither can my colleagues. It might be a problem with the dependencies, but I have no clear idea how to fix it. …
-
### Motivation
I am attempting to use Optuna for hyperparameter optimization of a complex, Lightning-based deep learning framework. It is essential for this framework to run in a distributed setting.…
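For reference, Optuna's usual way to share a study across processes is an RDB storage; below is a minimal, framework-agnostic sketch (the storage URL, study name, and toy objective are placeholders, and a real multi-node run would use MySQL/PostgreSQL rather than SQLite):

```python
import optuna

# Each worker process runs this same script; trials are coordinated
# through the shared storage backend.
STORAGE = "sqlite:///optuna_study.db"  # placeholder; use a server-based DB across nodes


def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    return (x - 2.0) ** 2


study = optuna.create_study(
    study_name="distributed-demo",
    storage=STORAGE,
    load_if_exists=True,
)
study.optimize(objective, n_trials=20)
```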
-
- [ ] Implement a webhook to auto-load new models, but compare performance before deploying to prod server.
- [x] Connect MLflow with DagsHub.
- [x] Track experiments using MLflow (a minimal MLflow/DagsHub tracking sketch follows this list).
- [x] Exp-1 Baselin…
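A minimal sketch of the MLflow/DagsHub tracking step, assuming a placeholder DagsHub repository and dummy parameter/metric values; credentials would normally come from the `MLFLOW_TRACKING_USERNAME`/`MLFLOW_TRACKING_PASSWORD` environment variables rather than the script:

```python
import mlflow

# Placeholder DagsHub MLflow endpoint; substitute your own user/repo.
mlflow.set_tracking_uri("https://dagshub.com/<user>/<repo>.mlflow")

with mlflow.start_run(run_name="exp-1-baseline"):
    # Dummy values standing in for the real experiment configuration/results.
    mlflow.log_param("model", "baseline")
    mlflow.log_param("learning_rate", 0.01)
    mlflow.log_metric("val_accuracy", 0.87)
```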
-
[Optuna](https://optuna.org/) is a hyperparameter tuning and, more generally, black-box optimisation framework. Using Optuna normally entails heavy computation, which makes it a reasonable candidate for ru…
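To make the shape of such a workload concrete, here is the kind of minimal study one would run, with a toy objective standing in for an expensive training job:

```python
import optuna


def objective(trial):
    # Toy black-box function standing in for an expensive training run.
    x = trial.suggest_float("x", -10.0, 10.0)
    booster = trial.suggest_categorical("booster", ["gbtree", "dart"])
    penalty = 0.5 if booster == "dart" else 0.0
    return (x - 2.0) ** 2 + penalty


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=100)
print(study.best_params, study.best_value)
```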
-
Sorry, but this is not a feature request.
Related: https://github.com/optuna/optuna/issues/2941
## Summary
Thank you for integrating Optuna, a hyperparameter optimization library. We, the Optuna dev-t…
-
#### Describe the issue linked to the documentation
#### Suggest a potential alternative/fix