AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

How do we pass prompt tuning as an adapter option to finetune.py? #14

Open ckevuru opened 1 year ago

HZQ950419 commented 1 year ago

Hi, for now finetune.py only supports LoRA, AdapterH, AdapterP, and Parallel Adapters. If you want to use prompt tuning, you can refer to the PEFT prompt-tuning example: https://github.com/huggingface/peft/blob/main/examples/causal_language_modeling/peft_prompt_tuning_clm.ipynb
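As a rough illustration, here is a minimal sketch of wiring up prompt tuning with PEFT along the lines of that notebook. The base model and initialization text below are placeholders taken for illustration, not values from this repo's finetune.py:

```python
# Minimal prompt-tuning sketch with Hugging Face PEFT.
# Model name and init text are placeholders; swap in your own.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

model_name = "bigscience/bloomz-560m"  # placeholder base model

peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    prompt_tuning_init=PromptTuningInit.TEXT,        # initialize soft prompt from text
    num_virtual_tokens=8,                            # number of trainable soft-prompt tokens
    prompt_tuning_init_text="Answer the question:",  # placeholder init text
    tokenizer_name_or_path=model_name,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model = get_peft_model(model, peft_config)  # freezes the base model; only the soft prompt trains
model.print_trainable_parameters()
```

The wrapped model can then be trained with a standard Trainer or training loop; only the virtual prompt tokens receive gradient updates.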

Please let us know if you have further questions!