-
Hi Jared,
Thanks for providing DeepSurv. I have a question about hyperparameter search. I am not sure what the benefit is of using the Docker/Optunity setup versus simply giving a list of parameter options in t…
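For contrast, here is a minimal sketch of the "list of parameter options" approach, i.e. plain exhaustive grid search. `train_and_score` is a hypothetical placeholder for one DeepSurv training run, not part of the library:

```python
from itertools import product

# Hypothetical placeholder for one DeepSurv training run; in practice it
# would train the network and return a validation metric (e.g. C-index).
def train_and_score(learning_rate, hidden_size):
    return -abs(learning_rate - 1e-3) + hidden_size * 1e-6  # toy score

grid = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "hidden_size": [16, 32, 64],
}

# Exhaustive grid: every combination is evaluated, so cost grows
# multiplicatively with each extra hyperparameter. Optunity's adaptive
# samplers explore continuous ranges instead, which is the usual
# argument for them over a fixed list.
best = max(
    (dict(zip(grid, combo)) for combo in product(*grid.values())),
    key=lambda params: train_and_score(**params),
)
print("best params:", best)
```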
-
Perform hyperparameter tuning and benchmark the performance:
- We use grid or random search to perform the tuning, as sketched below (note that gradient descent is more efficient but suboptimal; you should ask the inst…
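A hedged illustration of the two strategies, using scikit-learn rather than any project-specific tooling; the SVC objective here is just a stand-in:

```python
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)

# Grid search: exhaustively evaluates every listed combination.
grid = GridSearchCV(
    SVC(), {"C": [0.1, 1, 10], "gamma": [1e-3, 1e-2, 1e-1]}, cv=3
)
grid.fit(X, y)

# Random search: samples a fixed budget of configurations from
# distributions, which scales better when only a few hyperparameters
# actually matter.
rand = RandomizedSearchCV(
    SVC(),
    {"C": loguniform(1e-2, 1e2), "gamma": loguniform(1e-4, 1e0)},
    n_iter=20,
    cv=3,
    random_state=0,
)
rand.fit(X, y)
print(grid.best_params_, rand.best_params_)
```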
-
- Wandb Sweep
- Optuna
Has anyone here tried applying these, by any chance?
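Not from the original poster, but a minimal Optuna sketch with a toy objective; a real objective would train the model and return a validation metric (a W&B Sweep works similarly, driven by a sweep config instead of an objective function):

```python
import optuna

# Toy objective: Optuna minimizes the returned value. In practice this
# function would train a model with the suggested hyperparameters and
# return a validation loss.
def objective(trial):
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    layers = trial.suggest_int("layers", 1, 4)
    return (lr - 1e-3) ** 2 + 0.01 * layers

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)
print(study.best_params)
```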
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussion…
-
### Motivation
[Hill climbing](https://en.wikipedia.org/wiki/Hill_climbing) is a local search algorithm commonly used for optimization problems. The sampler based on the hill-climbing algorithm aims …
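A toy sketch of how such a sampler behaves, hill climbing over a small discrete grid; the names and the objective are illustrative, not Optuna internals:

```python
import random

# Toy objective over a 2-D hyperparameter grid (higher is better).
def score(params):
    return -(params["x"] - 3) ** 2 - (params["y"] - 5) ** 2

space = {"x": list(range(10)), "y": list(range(10))}

def neighbors(params):
    # Neighbors differ by one step in a single hyperparameter.
    for name, values in space.items():
        i = values.index(params[name])
        for j in (i - 1, i + 1):
            if 0 <= j < len(values):
                yield {**params, name: values[j]}

# Hill climbing: repeatedly move to the best-scoring neighbor, stopping
# at a local optimum (random restarts would be needed to escape one).
current = {name: random.choice(values) for name, values in space.items()}
while True:
    best = max(neighbors(current), key=score)
    if score(best) <= score(current):
        break
    current = best
print(current, score(current))
```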
-
First of all, thank you for the fantastic work on the sktime library! I am using the StackingForecaster in combination with ForecastingGridSearchCV for hyperparameter tuning. However, I'm unsure which…
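A sketch of one way to wire this up; the double-underscore parameter routing and the exact imports are assumptions to verify against your sktime version (e.g. via `stack.get_params().keys()`):

```python
from sktime.datasets import load_airline
from sktime.forecasting.compose import StackingForecaster
from sktime.forecasting.model_selection import (
    ForecastingGridSearchCV,
    SlidingWindowSplitter,
)
from sktime.forecasting.naive import NaiveForecaster
from sktime.forecasting.trend import PolynomialTrendForecaster

y = load_airline()

stack = StackingForecaster(
    forecasters=[
        ("naive", NaiveForecaster()),
        ("trend", PolynomialTrendForecaster()),
    ]
)

# Assumption: component hyperparameters are addressed with the usual
# double-underscore convention, <component name>__<param>.
param_grid = {
    "naive__strategy": ["last", "mean"],
    "trend__degree": [1, 2],
}

cv = SlidingWindowSplitter(fh=[1, 2, 3], window_length=36)
gscv = ForecastingGridSearchCV(forecaster=stack, cv=cv, param_grid=param_grid)
gscv.fit(y)
print(gscv.best_params_)
```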
-
I am trying to reproduce the training process from scratch on Voxpopuli-en. I preserved all your original hyperparameters but found that the watermark-related losses stayed the same even after 30 epochs.…
-
- [ ] automatic hyperparameter search
- [ ] support for more RLHF methods
- [ ] support for more models
- [ ] multimodal support
- [ ] auto-parallelization
- [ ] better dispatching and monitoring
-
Thank you for this wonderful repo!
I have a quick question: are the hyperparameters listed in the README optimized for both finetuning on a specific task and finetuning to support a new language (…