AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

AdapterH, AdapterP code #35

Open ChaoGaoUCR opened 11 months ago

ChaoGaoUCR commented 11 months ago

Dear Author,

Thanks for the great project. I am trying to find the AdapterH and AdapterP code, but I can't find it in finetune.py. I wonder where that part is.

Best

HZQ950419 commented 11 months ago

Hi,

The code is in LLM-Adapters/peft/src/peft/tuners/bottleneck.py. We use different arguments to indicate which adapter variant to use.
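
For reference, here is a minimal sketch of how the bottleneck adapter variant might be selected through configuration arguments. It assumes the `BottleneckConfig` exported by this repo's peft fork; the field names (`use_adapterp`, `use_parallel_adapter`, `bottleneck_size`, `target_modules`, ...) are taken from `finetune.py` and should be checked against your checkout.

```python
# Minimal sketch: choosing AdapterH vs. AdapterP via BottleneckConfig arguments.
# Assumes this repo's peft fork; field names and example values below are
# illustrative and should be verified against finetune.py and the README.
from transformers import AutoModelForCausalLM
from peft import BottleneckConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("yahma/llama-7b-hf")

config = BottleneckConfig(
    bottleneck_size=256,
    non_linearity="tanh",
    adapter_dropout=0.0,
    use_parallel_adapter=False,   # True would switch to the parallel-adapter variant
    use_adapterp=False,           # False -> AdapterH (Houlsby); True -> AdapterP (Pfeiffer)
    target_modules=["down_proj"], # example placement; see the README for recommended targets
    scaling=1.0,
    bias="none",
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, config)
model.print_trainable_parameters()
```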

ChaoGaoUCR commented 11 months ago

Thanks a lot, I will check it now!