AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

Is the IA^3 adapter on the roadmap? #11

Open AmrishJhingoer opened 1 year ago

AmrishJhingoer commented 1 year ago

Infused Adapter by Inhibiting and Amplifying Inner Activations, (IA)^3, seems promising in terms of parameter efficiency versus performance. Could you please add support for it?

https://arxiv.org/pdf/2205.05638.pdf
https://docs.adapterhub.ml/methods.html#ia-3
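For context, the core of (IA)^3 as described in the paper is very small: the pretrained weights stay frozen, and only three learned vectors are trained, which elementwise rescale the keys, the values, and the intermediate FFN activations. Below is a minimal NumPy sketch of that mechanism (not this repo's implementation; all shapes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pretrained weights (hypothetical shapes for illustration).
d_model, d_ff = 8, 16
W_k = rng.normal(size=(d_model, d_model))   # key projection (frozen)
W_v = rng.normal(size=(d_model, d_model))   # value projection (frozen)
W_ff = rng.normal(size=(d_ff, d_model))     # FFN up-projection (frozen)

# (IA)^3 trains only these rescaling vectors, initialized to ones
# so the adapted model starts out identical to the base model.
l_k = np.ones(d_model)
l_v = np.ones(d_model)
l_ff = np.ones(d_ff)

def ia3_forward(x):
    """Elementwise-rescale inner activations with the learned vectors."""
    k = l_k * (W_k @ x)                  # rescaled keys
    v = l_v * (W_v @ x)                  # rescaled values
    h = l_ff * np.maximum(W_ff @ x, 0)   # rescaled FFN activation (ReLU)
    return k, v, h

x = rng.normal(size=d_model)
k, v, h = ia3_forward(x)

# With all-ones vectors the outputs match the frozen base model exactly.
assert np.allclose(k, W_k @ x)

# Trainable parameter count is just 2 * d_model + d_ff per block.
n_trainable = l_k.size + l_v.size + l_ff.size
```

This is why the method is so parameter-efficient: per transformer block only `2 * d_model + d_ff` scalars are trained, versus the full frozen weight matrices.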

HZQ950419 commented 1 year ago

Sure, we are exploring how to add (IA)^3 to our framework.