TUDB-Labs / mLoRA

An Efficient "Factory" to Build Multiple LoRA Adapters
Apache License 2.0

Feature/Hyperparameter tuning #264

Closed · Saf9933 closed this 1 month ago

Saf9933 commented 1 month ago

This pull request introduces a new script, mlora_train_optuna.py, for hyperparameter tuning with Optuna. The key changes include: