tianyang-x / Mixture-of-Domain-Adapters

Codebase for ACL 2023 paper "Mixture-of-Domain-Adapters: Decoupling and Injecting Domain Knowledge to Pre-trained Language Models' Memories"
MIT License

can we do this with LLAMA model? #1

Closed kiran1501 closed 10 months ago

shizhediao commented 1 year ago

Hi, yes. We have not tried it ourselves, but you can directly change the backbone model to the LLaMA series.
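A minimal sketch of what swapping the backbone might look like with Hugging Face `transformers`. This is not code from the repo: the adapter-injection points are repo-specific, and here a tiny randomly initialized `LlamaModel` is built locally purely for illustration (so no weight download is needed); in practice you would load pretrained weights with `AutoModel.from_pretrained` and attach the domain adapters to its layers.

```python
# Hypothetical sketch: replace the original backbone with a LLaMA-family
# model from Hugging Face transformers. The tiny config below is for
# illustration only; real LLaMA configs are far larger.
import torch
from transformers import LlamaConfig, LlamaModel

config = LlamaConfig(
    vocab_size=1000,
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=4,
)
backbone = LlamaModel(config)  # randomly initialized, no download

# Forward a dummy batch to confirm the backbone produces hidden states
# that downstream adapter modules could consume.
input_ids = torch.randint(0, config.vocab_size, (1, 8))
hidden = backbone(input_ids).last_hidden_state
print(tuple(hidden.shape))  # (batch, seq_len, hidden_size)
```

The hidden states have shape `(batch, seq_len, hidden_size)`, which is the interface an adapter layer would typically hook into.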