-
Hey Laowu, great work. However, I have a specific request: can you please add the specific commands to run this code, and where should we start with the implementation?
-
### Motivation
I would like to use Optuna with the PyTorchLightningPruningCallback in a codebase with a pre-2.0 version of PyTorch Lightning. As it stands, I need to vendor the callback to support usi…
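For context, a minimal sketch of the usual callback wiring, assuming pre-2.0 PyTorch Lightning (the `pytorch_lightning` import path) and an Optuna build whose integration still targets that package; the model and data are toy placeholders:

```python
# A minimal sketch, assuming pre-2.0 PyTorch Lightning; ToyModel and the
# random tensors are placeholders standing in for a real model/datamodule.
import optuna
from optuna.integration import PyTorchLightningPruningCallback
import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, TensorDataset

class ToyModel(pl.LightningModule):
    def __init__(self, lr):
        super().__init__()
        self.lr = lr
        self.layer = torch.nn.Linear(4, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", torch.nn.functional.mse_loss(self.layer(x), y))

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.lr)

def objective(trial):
    data = TensorDataset(torch.randn(64, 4), torch.randn(64, 1))
    loader = DataLoader(data, batch_size=16)
    model = ToyModel(lr=trial.suggest_float("lr", 1e-4, 1e-1, log=True))
    trainer = pl.Trainer(
        max_epochs=5,
        callbacks=[PyTorchLightningPruningCallback(trial, monitor="val_loss")],
        enable_progress_bar=False,
    )
    trainer.fit(model, loader, loader)
    return trainer.callback_metrics["val_loss"].item()

study = optuna.create_study(direction="minimize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=10)
```

This is exactly the wiring that breaks once the integration imports `lightning.pytorch` (the 2.0+ package name), which appears to be what makes vendoring necessary on pre-2.0 codebases.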
-
`tests/workflows/test_pytorch_optuna.py::test_hpo` is failing with an obscure CommClosedError / CancelledError:
https://github.com/coiled/benchmarks/actions/runs/8362787642/job/22894238513
as th…
-
Hello Hydra Team,
I am exploring the possibility of integrating the Optuna Sweeper for hyperparameter tuning in a multi-process setup using GridSearch. My objective is to utilize multiple GPUs on …
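A minimal sketch of the Optuna Sweeper wiring, assuming hydra-core plus the hydra-optuna-sweeper plugin are installed; the config names and the `lr` parameter are placeholders:

```python
# A minimal sketch (hypothetical app.py): with the Optuna sweeper, the
# task function returns the float value being optimized. Assumes
# hydra-core and the hydra-optuna-sweeper plugin.
import hydra
from omegaconf import DictConfig

@hydra.main(config_path="conf", config_name="config", version_base=None)
def app(cfg: DictConfig) -> float:
    # Train on the GPU assigned to this process, then return the metric.
    score = (cfg.lr - 0.01) ** 2  # placeholder objective; real training goes here
    return score

if __name__ == "__main__":
    app()

# Launched in multirun mode with the sweeper override, e.g.:
#   python app.py -m hydra/sweeper=optuna lr=choice(0.001,0.01,0.1)
# For multiple worker processes, a parallel launcher such as the joblib
# plugin can be selected with hydra/launcher=joblib.
```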
-
The following error is encountered when running an Optimizer instance created using a pruner, as seen below: `Trial.should_prune() takes 1 positional argument but 2 were given`
```python
optimizer …
```
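This TypeError typically indicates code written against an older Optuna API, where `Trial.should_prune(step)` accepted a step argument; in current Optuna the method takes no arguments, and the step is supplied via `trial.report` instead. A minimal sketch of the current pattern:

```python
# A minimal sketch of the current Optuna pruning API: the step goes to
# `trial.report`, and `should_prune()` is called with no arguments.
import optuna

def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    loss = 1.0
    for step in range(100):
        loss *= 1.0 - lr  # placeholder training update
        trial.report(loss, step)   # report the intermediate value at this step
        if trial.should_prune():   # no step argument here
            raise optuna.TrialPruned()
    return loss

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=20)
```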
-
### What happened + What you expected to happen
I was attempting to follow the example for automatic hyperparameter tuning described here:
https://nixtlaverse.nixtla.io/neuralforecast/examples/autom…
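For reference, a minimal sketch of that auto-tuning API, assuming a neuralforecast version that supports `backend="optuna"`; `AirPassengersDF` is the toy dataset shipped with the library:

```python
# A minimal sketch, assuming neuralforecast with the Optuna backend;
# AutoNHITS searches its default hyperparameter space over num_samples trials.
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(
    models=[AutoNHITS(h=12, num_samples=5, backend="optuna")],
    freq="M",
)
nf.fit(df=AirPassengersDF)
preds = nf.predict()
```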
-
I have two feature requests related to optimization tracking and strategy management in Trace:
1. **Persistent Storage of Optimization Steps**
How can I store each optimization step (including p…
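One generic way to get persistent storage of steps, sketched framework-agnostically (every name below is hypothetical, not Trace's API):

```python
# A framework-agnostic sketch (all names hypothetical): append each
# optimization step as one JSON line so a run can be inspected, resumed,
# or compared across strategies later.
import json
import time

def log_step(path: str, step: int, params: dict, score: float) -> None:
    record = {"time": time.time(), "step": step, "params": params, "score": score}
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def load_steps(path: str) -> list[dict]:
    with open(path) as f:
        return [json.loads(line) for line in f]

# usage: log_step("run.jsonl", 3, {"prompt": "v3 of the prompt"}, 0.82)
```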
-
Currently there is some hyperparameter optimization using Optuna. I believe we can increase the concurrency of this search using Ray on Spark, as sketched below.
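A minimal sketch of one way to do this, assuming Ray >= 2.3 with the Spark integration (argument names vary across Ray versions) and a shared Optuna RDB storage; the storage URL is a placeholder:

```python
# A minimal sketch: start Ray on the Spark cluster, then run several
# concurrent Optuna workers against a shared study in RDB storage.
import optuna
import ray
from ray.util.spark import setup_ray_cluster, shutdown_ray_cluster

STORAGE = "postgresql://user:pass@host/optuna"  # placeholder storage URL

def objective(trial):
    x = trial.suggest_float("x", -10, 10)
    return (x - 2) ** 2  # placeholder objective

@ray.remote
def run_worker(n_trials):
    study = optuna.load_study(study_name="hpo", storage=STORAGE)
    study.optimize(objective, n_trials=n_trials)

setup_ray_cluster(num_worker_nodes=2)  # start Ray workers on Spark executors
ray.init()
optuna.create_study(study_name="hpo", storage=STORAGE, load_if_exists=True)
ray.get([run_worker.remote(25) for _ in range(4)])  # 4 concurrent workers
shutdown_ray_cluster()
```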
-
## Report incorrect documentation
**Location of incorrect documentation**
The qualx [doc](https://github.com/NVIDIA/spark-rapids-tools/blob/main/user_tools/docs/qualx.md#training) says to refer to Get s…
-
`create_study` accepts a `pruner` argument, as in the example [here](https://github.com/optuna/optuna-examples/blob/29720558a2bac9e0bead3a237640f4013f4ee2bd/pytorch/pytorch_lightning_simple.py#L161).
prune opti…
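For reference, a minimal sketch of the `pruner` argument on `create_study`, in the spirit of the linked example, which swaps in a no-op pruner to disable pruning; the flag name is a placeholder:

```python
# A minimal sketch of create_study's pruner argument, per the Optuna API;
# the objective passed to study.optimize must report intermediate values.
import optuna

enable_pruning = True  # placeholder flag, akin to args.pruning in the example
pruner = optuna.pruners.MedianPruner() if enable_pruning else optuna.pruners.NopPruner()

study = optuna.create_study(direction="minimize", pruner=pruner)
# study.optimize(objective, n_trials=50)
```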