TUDB-Labs / MixLoRA

State-of-the-art Parameter-Efficient MoE Method for NLP Tasks
Apache License 2.0

[help] How to train DoRA checkpoints #2

Open sorobedio opened 2 weeks ago

sorobedio commented 2 weeks ago

Do you have individual DoRA checkpoints for each task? I need them for an experiment. If there is a simple way to reproduce those results with DoRA, please share it with me. Thank you.

mikecovlee commented 2 weeks ago

Thanks for your attention to our work! Just add --use_dora and use the LoRA template when generating the configuration with launch.py.

python ./launch.py gen --template lora --use_dora True --tasks <arc-c/arc-e/boolq/obqa/piqa/siqa/hellaswag/winogrande>
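For example, a concrete instantiation of that command for a single task (here arc-c, one of the task names listed above) would look like the following; it uses only the flags already shown, and the same pattern applies to the other tasks in the list:

python ./launch.py gen --template lora --use_dora True --tasks arc-c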