-
**🚀 Feature Description**
Hey, we saw that there is no training code for fine-tuning all parts of XTTS V2. We would like to contribute if it adds value.
The aim can be to make it work very reliab…
-
-
The tuning parameters of the Barker proposal (at a minimum a scalar step size, and potentially a diagonal or dense preconditioner) would ideally be adapted in an initial warm-up stage of the ch…
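One way to sketch such a warm-up stage, purely as an illustration and not as this library's actual API: adapt the log step size with Robbins-Monro updates toward the commonly cited ≈0.57 target acceptance rate for the Barker proposal. The target here is a hypothetical standard normal, and all function names are made up for the example.

```python
import numpy as np

# Illustrative target: standard normal (a stand-in for the real model).
def log_pi(x):
    return -0.5 * x * x

def grad_log_pi(x):
    return -x

def barker_step(x, h, rng):
    """One Metropolis-Hastings step using the Barker proposal with scalar step size h."""
    z = h * rng.standard_normal()
    # Choose direction b = +1 with probability sigmoid(z * grad), else b = -1.
    b = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-z * grad_log_pi(x))) else -1.0
    d = b * z
    y = x + d
    # Barker log-acceptance: target ratio plus the direction-probability correction,
    # log sigmoid(t) written stably as -logaddexp(0, -t).
    log_alpha = (log_pi(y) - log_pi(x)
                 - np.logaddexp(0.0, d * grad_log_pi(y))
                 + np.logaddexp(0.0, -d * grad_log_pi(x)))
    accepted = np.log(rng.random()) < log_alpha
    return (y if accepted else x), accepted

def warmup_step_size(n_warmup=2000, target=0.57, seed=0):
    """Robbins-Monro adaptation of the log step size toward the target acceptance rate."""
    rng = np.random.default_rng(seed)
    x, log_h = 0.0, 0.0
    for t in range(1, n_warmup + 1):
        x, acc = barker_step(x, np.exp(log_h), rng)
        log_h += t ** -0.6 * (float(acc) - target)  # decaying adaptation gain
    return np.exp(log_h)
```

A preconditioner could be adapted in the same loop from running estimates of the chain's coordinate-wise variances; the sketch above covers only the scalar step size.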
-
Hi, I checked [this document](https://docs.openshift.com/container-platform/4.16/scalability_and_performance/using-node-tuning-operator.html#advanced-node-tuning-hosted-cluster_node-tuning-operator) an…
-
Hi there,
Will you consider implementing MPO in the near future? If I want to add PBT for hyperparameter tuning, what would you suggest I do?
Best regards,
ziqiao
-
I have a customer request to add some ITB-specific support, specifically the ability to have a switching point between AlphaN and Speed Density.
There has been conversation on the best ways to sup…
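A switching point between the two load models could be implemented as a blend rather than a hard cutover, so fueling doesn't step abruptly at the boundary. The sketch below is purely hypothetical (the function name, the choice of TPS as the blend axis, and the threshold values are all assumptions, not the firmware's actual behavior):

```python
def blended_load(tps, alpha_n_load, sd_load, switch_tps=20.0, blend_width=10.0):
    """Blend Speed Density and AlphaN load estimates around a TPS switch point.

    Below the blend window the Speed Density estimate is used, above it the
    AlphaN estimate; inside the window the weight ramps linearly between them.
    All parameter names and defaults are illustrative only.
    """
    # Weight for AlphaN ramps 0 -> 1 across [switch_tps - w/2, switch_tps + w/2].
    w = (tps - (switch_tps - blend_width / 2.0)) / blend_width
    w = min(1.0, max(0.0, w))
    return (1.0 - w) * sd_load + w * alpha_n_load
```

A hard switch is the degenerate case `blend_width -> 0`; exposing the width as a tuning parameter lets the user choose between the two behaviors.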
-
I'm trying to replicate your experiment on fine-tuning for ESC50 classification, but I'm experiencing a significant accuracy gap. I'm using the pre-trained model "BEATs_iter3" and I can't achieve an a…
-
The model could be improved by adding data preprocessing and applying hyperparameter tuning.
I would like to work on this issue.
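As a rough sketch of what "preprocessing plus hyperparameter tuning" could look like (synthetic data and a simple ridge model here, purely for illustration — not this repository's code): standardize the features, then pick the regularization strength by k-fold cross-validation.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge regression (no intercept) with penalty lam."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cv_mse(X, y, lam, k=5):
    """k-fold cross-validated mean squared error for one penalty value."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

# Synthetic data standing in for the real dataset.
rng = np.random.default_rng(0)
X = rng.standard_normal((120, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.standard_normal(120)

X = (X - X.mean(axis=0)) / X.std(axis=0)            # preprocessing: standardize
candidates = [0.01, 0.1, 1.0, 10.0]                  # tuning grid for the penalty
best_lam = min(candidates, key=lambda lam: cv_mse(X, y, lam))
```

The same shape applies with any model: a deterministic preprocessing step fit on the training folds, and a grid (or random/Bayesian search) over the hyperparameters scored by cross-validation.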
-
Hi Jaemin,
Thanks for the very interesting paper and releasing your codebase!
I have been working with your codebase for a different multimodal text generation task and observe lower performanc…
-
- Paper name: Unnatural Instructions: Tuning Language Models with (Almost) No Human Labor
- ArXiv Link: https://arxiv.org/abs/2212.09689
To close this issue, open a PR with a paper report using t…