-
Following up on a discussion I had with @nickp60 earlier about whether we should retune the `bbduk` parameters during trimming (given that we have some reads that look like adapter/empty sequence …
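A minimal sketch of what a retuned trimming call might look like, assuming `bbduk.sh` is on `PATH`; the file names and the specific `k`/`mink`/`hdist` values are illustrative guesses, not settled values:

```python
import subprocess

# Hypothetical input/output names; adjust to the pipeline's actual files.
cmd = [
    "bbduk.sh",
    "in=reads_R1.fq.gz", "in2=reads_R2.fq.gz",
    "out=trimmed_R1.fq.gz", "out2=trimmed_R2.fq.gz",
    "ref=adapters.fa",   # adapter reference shipped with BBTools
    "ktrim=r",           # trim adapter sequence from the right end
    "k=23", "mink=11",   # kmer length; allow shorter kmers at read tips
    "hdist=1",           # allow one mismatch when matching adapters
    "tpe", "tbo",        # trim pairs evenly; trim by pair overlap
    "minlen=50",         # drop reads shorter than this after trimming
]
subprocess.run(cmd, check=True)
```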
-
We're super happy that you made this.
I can open a PR against dials for the new tuning parameters (if you like), or you could take a dials+scales dependency and add them here.
Also, we should modif…
-
Currently there is some hyperparameter optimization using Optuna. I believe we can increase the concurrency of this search using Ray on Spark.
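As a minimal sketch of the current setup and the knob that controls concurrency (the objective below is a stand-in, not the project's real one); a Ray-on-Spark setup would replace the in-process `n_jobs` parallelism with remote workers sharing one study storage:

```python
import optuna

def objective(trial: optuna.Trial) -> float:
    # Stand-in search space; the real objective trains/evaluates the model.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    depth = trial.suggest_int("depth", 2, 10)
    return (lr - 0.01) ** 2 + (depth - 6) ** 2  # toy loss to minimize

study = optuna.create_study(direction="minimize")
# n_jobs runs trials concurrently in threads within one process;
# distributing over Ray would instead run the objective on remote workers
# against a shared study backend (e.g., an RDB storage).
study.optimize(objective, n_trials=50, n_jobs=4)
print(study.best_params)
```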
-
I am using the `bfastlite()` function to run a time-series analysis. From the author's [paper](https://www.mdpi.com/2072-4292/13/16/3308) (table 2), I quote:
> Needs parameter tuning to optimise …
-
For now, for parameter tuning, we offer both `random` (where the user selects a max number of parameter combinations to try) and `grid` (which tries them all); a sketch of the two strategies follows after this list.
1. We need to document this somewh…
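A library-agnostic sketch of the two strategies; the search space and function names here are illustrative, not our actual API:

```python
import itertools
import random

space = {"lr": [0.001, 0.01, 0.1], "depth": [2, 4, 8], "dropout": [0.0, 0.5]}

def grid(space):
    """Yield every combination in the space."""
    keys = list(space)
    for values in itertools.product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

def random_search(space, max_combos, seed=0):
    """Sample up to max_combos distinct combinations."""
    combos = list(grid(space))
    rng = random.Random(seed)
    return rng.sample(combos, min(max_combos, len(combos)))

print(len(list(grid(space))))   # 18: grid tries them all
print(random_search(space, 5))  # random tries a user-capped subset
```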
-
Hi there,
I've been using a little bit of `~dirt.orbits.each` hackery to make orbits pick up a `tuningName` parameter and, if present, pass that tuning name on to the associated Event. This allows spec…
-
Hi,
How can I tune the hyperparameters for the model? I couldn't find any reference for this case. Any guidance would be appreciated.
-
It would be nice to have a language facility that allows a user to add their own tuning parameters. A tuning parameter is essentially a hidden argument of type `i64` that you configure through the …
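By analogy only (this is not Futhark syntax or the proposed design), the idea resembles a hidden integer knob that callers never pass explicitly but that an autotuner or user can set from outside, e.g. via the environment; a rough Python sketch of the concept:

```python
import os

def _tuning_param(name: str, default: int) -> int:
    """Hidden i64-like argument, configured externally, never at the call site."""
    return int(os.environ.get(f"TUNE_{name}", default))

def process(xs: list[int]) -> list[int]:
    block = _tuning_param("BLOCK_SIZE", 256)  # callers never see this argument
    out = []
    for i in range(0, len(xs), block):  # block size changes scheduling, not results
        out.extend(x * 2 for x in xs[i : i + block])
    return out

# An autotuner would try several TUNE_BLOCK_SIZE values and keep the fastest.
print(process(list(range(10))))
```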
-
Thank you for your outstanding work, which has allowed me to quickly start my fine-tuning process. However, I have the following two questions:
1. In the LoRA fine-tuning of the LLaVA series, most …
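For context, a minimal PEFT-style LoRA setup of the kind such fine-tuning typically uses; the base model id and target modules below are placeholders, not the LLaVA recipe from this repo:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Placeholder base model; a real LLaVA run would load the multimodal checkpoint.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

lora_config = LoraConfig(
    r=16,                                 # low-rank dimension
    lora_alpha=32,                        # scaling factor
    target_modules=["q_proj", "v_proj"],  # which projections get adapters
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()        # only the adapter weights train
```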
-
I want to tune the hyperparameters inside the model so that they are best suited to the data. How can I do that?