OpenGVLab / LLaMA-Adapter

[ICLR 2024] Fine-tuning LLaMA to follow Instructions within 1 Hour and 1.2M Parameters

Can I attach multiple adapters? #118

daehuikim commented 9 months ago

I am curious whether I can attach different LLaMA-Adapters to the same frozen model. For example, suppose I train two different LLaMA-Adapters, one on legal text and the other on medical text. Can I then give both of them to LLaMA 2? If that is possible, does the LLM obtain both kinds of knowledge, or do the adapters interfere with each other? Has anybody tried this?
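
To make the question concrete, here is a minimal sketch of the naive setup I have in mind, written with LoRA-style low-rank adapters for simplicity rather than this repo's actual adaption-prompt design; the `legal`/`medical` names, ranks, and shapes are all placeholders:

```python
import torch
import torch.nn as nn

class TwoAdapterLinear(nn.Module):
    """A frozen linear layer with two independently trained
    LoRA-style adapters whose deltas are simply summed."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the base model stays frozen

        d_in, d_out = base.in_features, base.out_features
        # Adapter 1: hypothetically trained on legal text.
        self.legal_A = nn.Parameter(torch.randn(rank, d_in) * 0.01)
        self.legal_B = nn.Parameter(torch.zeros(d_out, rank))
        # Adapter 2: hypothetically trained on medical text.
        self.medical_A = nn.Parameter(torch.randn(rank, d_in) * 0.01)
        self.medical_B = nn.Parameter(torch.zeros(d_out, rank))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        delta_legal = x @ self.legal_A.T @ self.legal_B.T
        delta_medical = x @ self.medical_A.T @ self.medical_B.T
        return self.base(x) + delta_legal + delta_medical
```

Since both deltas are added onto the same frozen output with nothing arbitrating between them, nothing in this setup prevents interference, which is exactly what I am asking about.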

gaopengpjlab commented 9 months ago

Please refer to LoraHub for a similar idea.

LoraHub: Efficient Cross-Task Generalization via Dynamic LoRA Composition
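
For a rough feel of what LoraHub does, composing two independently trained LoRA adapters with fixed weights can be sketched with Hugging Face PEFT's adapter-merging utilities. This is an illustration under assumptions: it applies to LoRA adapters rather than this repo's adaption prompts, the checkpoint name and paths are placeholders, and LoraHub itself learns the mixing weights with a gradient-free search instead of fixing them by hand.

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Frozen base model (placeholder checkpoint).
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Load two independently trained LoRA adapters (placeholder paths).
model = PeftModel.from_pretrained(base, "path/to/legal-lora", adapter_name="legal")
model.load_adapter("path/to/medical-lora", adapter_name="medical")

# Merge them into one adapter with fixed mixing weights.
model.add_weighted_adapter(
    adapters=["legal", "medical"],
    weights=[0.5, 0.5],
    adapter_name="legal_medical",
    combination_type="linear",  # requires both adapters to have the same rank
)
model.set_adapter("legal_medical")
```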

gaopengpjlab commented 9 months ago

Here is another interesting read:

Domain Generalization Using Large Pretrained Models with Mixture-of-Adapters
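
For a concrete taste of the mixture-of-adapters direction, below is a minimal sketch, not the paper's exact architecture; every name and dimension is an assumption. A small learned gate produces per-token weights over several bottleneck adapters, so the model can route legal-sounding inputs to one adapter and medical-sounding inputs to the other instead of summing both unconditionally.

```python
import torch
import torch.nn as nn

class AdapterMixture(nn.Module):
    """Frozen linear layer plus a learned per-token mixture
    over several bottleneck adapters (sketch only)."""

    def __init__(self, base: nn.Linear, num_adapters: int = 2, rank: int = 8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # base stays frozen

        d_in, d_out = base.in_features, base.out_features
        self.down = nn.ModuleList(
            nn.Linear(d_in, rank, bias=False) for _ in range(num_adapters)
        )
        self.up = nn.ModuleList(
            nn.Linear(rank, d_out, bias=False) for _ in range(num_adapters)
        )
        for up in self.up:
            nn.init.zeros_(up.weight)  # adapters start as a no-op
        # Gate: per-token softmax weights over the adapters.
        self.gate = nn.Linear(d_in, num_adapters)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)  # (..., num_adapters)
        deltas = torch.stack(
            [up(down(x)) for down, up in zip(self.down, self.up)], dim=-1
        )  # (..., d_out, num_adapters)
        mixed = (deltas * weights.unsqueeze(-2)).sum(dim=-1)  # (..., d_out)
        return self.base(x) + mixed
```

The gate is what plain addition lacks: the two adapters no longer have to agree on every token, because the mixture weights decide which one dominates.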

daehuikim commented 9 months ago

@gaopengpjlab How kind of you! Thanks for the helpful references. I would welcome any more ideas that speak to this question. :)