TUDB-Labs / MixLoRA

State-of-the-art Parameter-Efficient MoE Fine-tuning Method
Apache License 2.0

Code Request for paper. #1

Closed: kanseaveg closed this issue 3 months ago

kanseaveg commented 4 months ago

Hello, I am very interested in your paper. May I ask when you plan to publish the source code? I can't wait to reproduce it, thank you.

mikecovlee commented 4 months ago

Our code is now available as part of our m-LoRA framework.

lzw-lzw commented 4 months ago

Hi, where is the launch.py mentioned in the README? Thanks.

mikecovlee commented 4 months ago

> Hi, where is the launch.py mentioned in the README? Thanks.

Please refer to the m-LoRA framework.