-
https://arxiv.org/abs/1711.09846
https://dl.acm.org/citation.cfm?doid=3292500.3330649
This is an interesting approach to hyperparameter optimization that I just came across. An implementation in DeepC…
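For context, the first link is "Population Based Training of Neural Networks" (Jaderberg et al.): a population of workers trains in parallel, and periodically the worst performers copy the state of the best ones (exploit) and randomly perturb the copied hyperparameters (explore). A minimal, self-contained sketch with a toy objective standing in for real training (all names below are illustrative):

```python
import random

# Toy sketch of Population Based Training (PBT, arXiv:1711.09846).
# Each worker's "training" just rewards learning rates near an unknown
# optimum; a real implementation would train a network here.
OPTIMAL_LR = 0.01

def train_and_eval(worker):
    lr = worker["hyperparams"]["lr"]
    worker["score"] += 1.0 / (1.0 + abs(lr - OPTIMAL_LR))

def exploit_and_explore(population):
    # Exploit: the bottom quartile copies a random top-quartile worker's
    # state (the score stands in for model weights in this toy).
    # Explore: the copied hyperparameters are randomly perturbed.
    population.sort(key=lambda w: w["score"], reverse=True)
    cutoff = max(1, len(population) // 4)
    for loser in population[-cutoff:]:
        winner = random.choice(population[:cutoff])
        loser["score"] = winner["score"]
        loser["hyperparams"] = {
            k: v * random.choice((0.8, 1.2))
            for k, v in winner["hyperparams"].items()
        }

population = [
    {"score": 0.0, "hyperparams": {"lr": 10 ** random.uniform(-4, 0)}}
    for _ in range(8)
]
for step in range(1, 101):
    for worker in population:
        train_and_eval(worker)
    if step % 10 == 0:
        exploit_and_explore(population)

best = max(population, key=lambda w: w["score"])
print("best lr found:", best["hyperparams"]["lr"])
```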
-
Dear authors, thank you for your amazing work.
I'm trying to reproduce one of your experiments using the AdamW optimizer (as mentioned in Appendix A.1 of your paper).
Which hyperparameters (**bas…
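For reference, here is where those knobs live in PyTorch; the values shown are just PyTorch's library defaults, not the paper's settings (the paper's settings are exactly what I'm asking about):

```python
import torch

# Illustrative only: PyTorch's AdamW with its library defaults.
# These are NOT the paper's values; the model below is a stand-in.
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-3,                # base learning rate
    betas=(0.9, 0.999),     # first/second moment decay rates
    eps=1e-8,
    weight_decay=1e-2,      # decoupled weight decay (the "W" in AdamW)
)
```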
-
**Is your feature request related to a problem? Please describe.**
Training and fine-tuning models often involve significant manual work, especially when experimenting with different hyperparameters …
-
"File "train_dual.py", line 70
LOGGER.info(colorstr('hyperparameters: ') + ', '.join(f'{k}={v}' for k, v in hyp.items()))
^
Sy…
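If the truncated message is a `SyntaxError` pointing at the f-string, a likely cause is running the script under Python < 3.6, where f-strings are not supported. A version-agnostic rewrite of that one line, with `LOGGER`, `colorstr`, and `hyp` coming from the repo as before:

```python
# Same output as the f-string version, but valid on older Pythons too:
LOGGER.info(colorstr('hyperparameters: ') +
            ', '.join('{}={}'.format(k, v) for k, v in hyp.items()))
```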
-
Hi, I'm excited about your TPOT tool and how it infers hyperparameters for binary classifiers. I was wondering whether you have any plans to extend TPOT to unsupervised machine learning, i.e. **cluster…
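For context, the supervised workflow being referred to looks roughly like this (tiny search budget for brevity; an unsupervised `TPOTClusterer` analogue would be the hypothetical counterpart being requested):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

# Minimal sketch of TPOT's supervised API on a binary task.
# generations/population_size are kept small so the example runs quickly.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

tpot = TPOTClassifier(generations=5, population_size=20,
                      verbosity=2, random_state=42)
tpot.fit(X_train, y_train)
print(tpot.score(X_test, y_test))
tpot.export('best_pipeline.py')  # writes the winning sklearn pipeline to disk
```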
-
### Motivation
I am attempting to use Optuna for hyperparameter optimization of a complex, Lightning-based deep learning framework. It is essential for this framework to run in a distributed setting.…
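For concreteness, a minimal sketch of the kind of study meant here, with a toy objective standing in for the Lightning training loop:

```python
import optuna

def objective(trial):
    # In the real framework these values would configure a LightningModule;
    # the quadratic below is a toy stand-in for the validation loss.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    dropout = trial.suggest_float("dropout", 0.0, 0.5)
    return (lr - 0.01) ** 2 + dropout

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```

For distributed runs, several processes can already share one study by passing a `storage` URL (e.g. an RDB) together with `study_name` and `load_if_exists=True` to `optuna.create_study`.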
-
Hello, as a newcomer to reinforcement learning, I have a few questions I'd like to ask.
1. Are the hyperparameters in the model suitable for all scenarios?
2. When I was training the model with y…
-
Hello,
I am trying to reproduce OmniMotion results on TAP-Vid DAVIS. I preprocessed and trained the models using the default configs (except for setting `num_iters=200_000`). However, when evaluating t…
-
Hi @davidsandberg! Thanks for your work on this repo!
When training with triplet loss using the recommended hyperparameters in the wiki, what kind of results were obtained? It'd be great if I could…
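For reference, by triplet loss I mean the FaceNet-style objective below (an illustrative sketch, not necessarily this repo's exact implementation):

```python
import torch

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Pull anchor-positive pairs together and push anchor-negative pairs
    # apart by at least `margin` in squared Euclidean distance.
    pos_dist = (anchor - positive).pow(2).sum(dim=1)
    neg_dist = (anchor - negative).pow(2).sum(dim=1)
    return torch.clamp(pos_dist - neg_dist + margin, min=0.0).mean()
```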
-
Hi there,
I am currently trying to determine the robustness of sparse networks against adversarial examples. For this, we are trying to reproduce the results from your paper, specifically the MNIST-…
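For context, a common baseline for generating adversarial examples is the fast gradient sign method (FGSM, Goodfellow et al., 2014); the sketch below is illustrative and not necessarily the attack evaluated in the paper:

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, epsilon):
    # Perturb each input pixel by +/- epsilon in the direction that
    # increases the classification loss, then clip back to valid range.
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    return (x + epsilon * x.grad.sign()).clamp(0.0, 1.0).detach()
```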