-
Problem: I am unable to catch the TOutOfMemoryError using a try-catch block while fine-tuning the hyperparameters with Optuna.
catboost version: 1.2.3
Operating System: Ubuntu
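If the out-of-memory condition surfaced as an ordinary Python exception, the usual pattern would be to guard the training call inside the objective and return a sentinel "infinitely bad" value, so one failed trial does not abort the whole study (Optuna's `Study.optimize` also accepts a `catch=` tuple for this). A stdlib-only sketch, with a hypothetical `train_catboost` stand-in; whether CatBoost's `TOutOfMemoryError` actually maps to a catchable Python exception is exactly the question this issue raises:

```python
import math

def train_catboost(depth):
    """Hypothetical stand-in for a CatBoost fit call; a large `depth`
    simulates the allocator running out of memory."""
    if depth > 12:
        raise MemoryError("simulated TOutOfMemoryError")
    return 1.0 / depth  # pretend this is a validation loss

def objective(depth):
    # Guard the training call so an OOM surfaces as a failed trial
    # value instead of killing the tuning process.
    try:
        return train_catboost(depth)
    except MemoryError:
        return math.inf  # mark the trial as infinitely bad
```

The sketch assumes the error is raised as `MemoryError`; if the C++ layer terminates the process instead, no in-process `try`/`except` can help and the trial has to be isolated in a subprocess.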
-
Add here: https://github.com/optuna/optuna-examples/
and make a PR on the main repo to add us to the README.
-
https://github.com/AI4Finance-Foundation/FinRL-Tutorials/blob/master/4-Optimization/FinRL_HyperparameterTuning_using_Optuna_basic.ipynb
I am getting the following error while recreating the above n…
-
Pruners are not currently supported through the hydra optuna sweeper plugin: https://github.com/facebookresearch/hydra/issues/1710.
Adding the pruner to the sweeper ourselves might be doable, as i…
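For context on what the sweeper would have to support: median pruning boils down to recording each trial's intermediate value per step and stopping any trial that falls below the median of earlier trials at the same step. A stdlib-only toy version of that logic (all names hypothetical; Optuna's real `MedianPruner` adds warm-up and startup-trial handling on top):

```python
from statistics import median

class ToyMedianPruner:
    """Minimal stand-in for a median pruner: prune a trial whose
    intermediate value is worse (higher, for minimization) than the
    median of previously recorded trials at the same step."""

    def __init__(self):
        self.history = {}  # step -> list of values from earlier trials

    def record(self, step, value):
        # Called when a trial finishes reporting a value at this step.
        self.history.setdefault(step, []).append(value)

    def should_prune(self, step, value):
        past = self.history.get(step, [])
        # Never prune when there is no history to compare against.
        return bool(past) and value > median(past)
```

In real Optuna code the equivalent calls are `trial.report(value, step)` followed by `trial.should_prune()`; wiring those into the hydra sweeper's trial loop is the part the linked issue says is missing.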
-
# 🐛 Bug
## Description
I perform hyperparameter search for a deep learning model, and there are situations when the model diverges and starts producing NaNs in its output. My code exits gracefully, returning tuple o…
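A common workaround for diverged trials is to detect the NaN in the objective and prune the trial rather than report the NaN, so the sampler never sees it. A sketch using a stdlib stand-in for `optuna.TrialPruned` (the real exception Optuna treats as "discard this trial"):

```python
import math

class TrialPruned(Exception):
    """Stand-in for optuna.TrialPruned; raised to discard a diverged trial."""

def objective_value(loss):
    # A NaN loss means the model diverged; prune the trial instead of
    # returning NaN, which would poison the study's history.
    if math.isnan(loss):
        raise TrialPruned("model diverged: loss is NaN")
    return loss
```

Whether returning NaN should itself be handled by Optuna, rather than forcing the user to raise, is what this bug report is about.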
-
## Reference
[https://blog.gtwang.org/programming/python-beautiful-soup-module-scrape-web-pages-tutorial/](https://blog.gtwang.org/programming/python-beautiful-soup-module-scrape-web-pages-tutorial/)…
-
Hi,
The optimization didn't work, so I just added the line
"metrics_callback.on_validation_end(trainer)" after line 209.
I also modified the class :
class MetricsCallback(Callback…
-
- PyTorch-Forecasting version: 0.9.2
- PyTorch version: 1.10.1
- Python version: 3.8.12
- Operating System: win10 x64
### Expected behavior
I executed code with optimize_hyperparameters funct…
-
In order to maximize the number of potential users of DeepCave, we should aim to support more HPO tools. In particular, we should write converters for:
- [ ] SyneTune
- [ ] BoTorch
- [ ] HEBO
- [ ] Ray Tune
…
-