AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

p-tuning in finetune.py? #56

Open smkim0220 opened 4 months ago

smkim0220 commented 4 months ago

LoRA, bottleneck adapters, and prefix tuning are implemented in finetune.py, but p-tuning seems to have been removed. P-tuning itself is implemented under LLM-Adapters/tree/main/peft/src/peft/tuners, so is there a reason it is missing from finetune.py?
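
For reference, here is a minimal sketch of how p-tuning could presumably be wired into finetune.py, assuming the bundled peft fork mirrors the upstream `PromptEncoderConfig` API (the base model name and hyperparameters below are illustrative, not the repo's defaults):

```python
# Hypothetical sketch: enabling p-tuning via peft's PromptEncoderConfig,
# assuming the vendored peft fork exposes the same API as upstream peft.
import torch
from transformers import AutoModelForCausalLM
from peft import PromptEncoderConfig, TaskType, get_peft_model

base_model = "yahma/llama-7b-hf"  # placeholder; whichever base model finetune.py targets
model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.float16)

# P-tuning: virtual prompt tokens are produced by a small trainable prompt encoder.
config = PromptEncoderConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=20,    # length of the learned soft prompt (illustrative value)
    encoder_hidden_size=128,  # hidden size of the prompt encoder (illustrative value)
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # only the prompt encoder parameters are trainable
```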