-
Several options for covering hyperparameter space more efficiently than a systematic grid search are discussed in the scikit-learn documentation. We can pick one, or experiment and then pick, to enable…
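A minimal sketch of one such option, scikit-learn's `RandomizedSearchCV`, which samples a fixed number of configurations rather than enumerating the full grid (the estimator and ranges below are illustrative, not this project's actual pipeline):

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
search = RandomizedSearchCV(
    SVC(),
    param_distributions={
        "C": loguniform(1e-2, 1e2),       # sample C on a log scale
        "gamma": loguniform(1e-4, 1e0),
    },
    n_iter=20,        # number of sampled configurations, vs. the full grid
    cv=5,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```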
-
Docs: https://docs.ray.io/en/master/tune/index.html
Core example: https://github.com/ray-project/ray/blob/master/python/ray/tune/examples/pbt_transformers/pbt_transformers.py
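Since the links above point to Ray Tune, here is a minimal sketch of its search API; the trainable, metric, and search space are placeholders standing in for the real training code:

```python
from ray import tune

def trainable(config):
    # Placeholder objective; a real trainable would train and evaluate
    # a model using config["lr"] and config["dropout"].
    score = -(config["lr"] - 0.01) ** 2 - (config["dropout"] - 0.3) ** 2
    return {"score": score}  # returning a dict reports final metrics

tuner = tune.Tuner(
    trainable,
    param_space={
        "lr": tune.loguniform(1e-5, 1e-1),
        "dropout": tune.uniform(0.0, 0.5),
    },
    tune_config=tune.TuneConfig(metric="score", mode="max", num_samples=20),
)
results = tuner.fit()
print(results.get_best_result().config)
```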
-
To identify the most effective model/hyperparameter combination, we must run a grid search using the evaluation pipeline. This issue depends on #6
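A minimal sketch of a grid search over a scikit-learn `Pipeline`; the steps and grid are placeholders, not this project's evaluation pipeline:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
search = GridSearchCV(
    pipe,
    param_grid={"clf__C": [0.1, 1.0, 10.0]},  # "step__param" naming
    cv=5,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```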
-
### Description
Scikit-learn offers a grid search variant in which a large number of candidates (parameter configurations) is first trained on a very small subset of the training data. At each step the most promisin…
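The behaviour described above matches scikit-learn's successive-halving search. A minimal sketch, with an illustrative estimator and grid:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.experimental import enable_halving_search_cv  # noqa: F401
from sklearn.model_selection import HalvingGridSearchCV

X, y = make_classification(n_samples=2000, random_state=0)
search = HalvingGridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"max_depth": [3, 5, None], "min_samples_split": [2, 5, 10]},
    factor=3,              # keep the top ~1/3 of candidates each iteration
    resource="n_samples",  # grow the training-set size each round
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```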
-
### Feature Request: MLOps Integration for NLU Section Inspired by MLflow Features
#### Description
To enhance Hexabot's NLU capabilities, we propose integrating MLOps-inspired features to strea…
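For reference, a minimal sketch of the kind of experiment tracking MLflow provides; the run, parameter, and metric names here are hypothetical, not Hexabot internals:

```python
import mlflow

# Hypothetical run/metric names; real values would come from the
# NLU training loop.
with mlflow.start_run(run_name="nlu-intent-classifier"):
    mlflow.log_param("learning_rate", 3e-5)
    mlflow.log_param("epochs", 5)
    # ... train and evaluate the NLU model here ...
    mlflow.log_metric("intent_f1", 0.91)
```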
-
### Feature request
Adding generation configurations to the parameters that can be tuned in a `Trainer`.
### Motivation
When defining the Optuna hyper-parameter space, I would like to invest…
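For context, this is roughly how an Optuna space is wired into `Trainer.hyperparameter_search` today. Only `TrainingArguments` fields can appear in the returned dict, which is the limitation this request targets; the sketch assumes a `trainer` already built with `model_init`:

```python
def optuna_hp_space(trial):
    # Only TrainingArguments fields are accepted here; generation
    # settings (e.g. num_beams) cannot currently be tuned.
    return {
        "learning_rate": trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True),
        "num_train_epochs": trial.suggest_int("num_train_epochs", 1, 5),
    }

best_run = trainer.hyperparameter_search(  # `trainer` assumed defined
    direction="maximize",
    backend="optuna",
    hp_space=optuna_hp_space,
    n_trials=10,
)
```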
-
Is the hyperparameter setting in Appendix B of the paper the optimal one?
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussion…
-