AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

any code to merge the adapter weight with the original base model? #25

Open sohuren opened 1 year ago

sohuren commented 1 year ago

Thanks for this great contribution! It seems that export_hf_checkpoint.py only works for LoRA. Can you extend it to other adapters such as AdapterH/P/Parallel? Thanks!

HZQ950419 commented 1 year ago

Hi,

Only LoRA parameters can be merged into the original base model, as demonstrated in the original LoRA paper. LoRA's learned update is a low-rank product that can simply be added to the frozen weight matrix, whereas series/parallel bottleneck adapters (AdapterH/P/Parallel) insert new modules with nonlinearities that cannot be folded into the existing weights.
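
For reference, here is a minimal sketch of merging LoRA weights into the base model using the PEFT API rather than the repository's export_hf_checkpoint.py; the model name and paths are placeholders, not values from this repo.

```python
# Sketch: fold LoRA weights into the base model with peft's merge_and_unload().
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model_name = "yahma/llama-7b-hf"        # placeholder base model
lora_adapter_path = "./trained_lora_adapter"  # placeholder adapter checkpoint

base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name, torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base_model, lora_adapter_path)

# For LoRA, the effective weight is W + (alpha / r) * B @ A, so the update
# can be added into the frozen weight and the adapter modules dropped.
merged_model = model.merge_and_unload()

merged_model.save_pretrained("./merged_model")
```

This only works because the LoRA update is itself a weight matrix of the same shape as the layer it modifies; bottleneck-style adapters have no such equivalent, so they must remain as separate modules at inference time.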