AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

AdapterH, AdapterP code #35

Open · ChaoGaoUCR opened 1 year ago

ChaoGaoUCR commented 1 year ago

Dear Author,

Thanks for the great project. I am trying to find the AdapterH and AdapterP code, but I can't find it in finetune.py. Where is that part implemented?

Best

HZQ950419 commented 1 year ago

Hi,

The code is in LLM-Adapters/peft/src/peft/tuners/bottleneck.py. We use different arguments to indicate which adapter variant (AdapterH or AdapterP) to use.
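For reference, here is a minimal sketch of how a bottleneck adapter is typically configured with the peft fork bundled in this repo (upstream peft has no `BottleneckConfig`). The field names and example values below (`bottleneck_size`, `use_adapterp`, `use_parallel_adapter`, `target_modules`, etc.) are assumptions based on a reading of bottleneck.py, so please verify them against the file; the key point is that `use_adapterp` toggles the AdapterP placement, while the default bottleneck settings correspond to AdapterH.

```python
# Minimal sketch, assuming the BottleneckConfig fields below exist in
# LLM-Adapters/peft/src/peft/tuners/bottleneck.py -- verify names before use.
from transformers import AutoModelForCausalLM
from peft import BottleneckConfig, get_peft_model  # requires the repo's bundled peft fork

# Load a base model (example checkpoint; substitute your own).
model = AutoModelForCausalLM.from_pretrained("yahma/llama-7b-hf")

config = BottleneckConfig(
    bottleneck_size=256,          # hidden size of the adapter bottleneck
    non_linearity="tanh",         # activation inside the adapter
    adapter_dropout=0.0,
    use_parallel_adapter=False,   # True would select the parallel-adapter variant
    use_adapterp=False,           # False -> AdapterH placement; True -> AdapterP placement
    target_modules=["down_proj"], # modules to attach adapters to (assumed example)
    scaling=1.0,
    bias="none",
    task_type="CAUSAL_LM",
)

# Wrap the base model so only the adapter parameters are trainable.
model = get_peft_model(model, config)
model.print_trainable_parameters()
```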

ChaoGaoUCR commented 1 year ago

Thanks a lot, I will check it now!