Open daehuikim opened 9 months ago
Please refer to LoraHub for possibly similar ideas:
LoraHub: Efficient Cross-Task Generalization via Dynamic LoRA Composition
Here is another interesting read:
Domain Generalization Using Large Pretrained Models with Mixture-of-Adapters
@gaopengpjlab How kind of you! Thanks for the helpful references. I am still hoping for more ideas that speak to my question. :)
I am curious whether I can attach different types of LLaMA-Adapter to a frozen model. For example, suppose I train two different LLaMA-Adapters, one for legal vocabulary and the other for medical vocabulary. Can I then attach both of them to Llama 2? If that is possible, does the LLM gain both kinds of knowledge, or do the adapters interfere with each other? Has anybody tried this or does anyone know?
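For context, here is a minimal sketch of what attaching two domain adapters to a single frozen Llama 2 base could look like using Hugging Face PEFT's multi-adapter support. Note the assumptions: it uses LoRA-style adapters rather than the original LLaMA-Adapter code, and the adapter paths ("path/to/legal-lora", "path/to/medical-lora") and merge weights are hypothetical placeholders.

```python
# Sketch only: PEFT multi-adapter loading on a frozen Llama 2 base.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Attach the first adapter and give it a name; the base weights stay frozen.
model = PeftModel.from_pretrained(base, "path/to/legal-lora", adapter_name="legal")

# Attach a second adapter to the same base model.
model.load_adapter("path/to/medical-lora", adapter_name="medical")

# Option 1: switch the active adapter per request, so the two never mix.
model.set_adapter("legal")    # legal-domain behaviour
model.set_adapter("medical")  # medical-domain behaviour

# Option 2 (closer in spirit to LoraHub): merge both adapters with fixed
# weights into a new adapter. "linear" assumes both adapters share the same rank.
model.add_weighted_adapter(
    adapters=["legal", "medical"],
    weights=[0.5, 0.5],
    adapter_name="legal_medical",
    combination_type="linear",
)
model.set_adapter("legal_medical")
```

Whether the merged adapter keeps both skills or they interfere is exactly the open question here; LoraHub searches for the combination weights rather than fixing them at 0.5/0.5 as in this sketch.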