AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

How to tune ChatGLM-6B with your framework? #7

Open Curious-chen opened 1 year ago

Curious-chen commented 1 year ago

I am unable to load ChatGLM-6B with this codebase.

HZQ950419 commented 1 year ago

ChatGLM is not supported yet; we will add ChatGLM support tomorrow.
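For context while ChatGLM support was pending: LoRA, one of the adapter types in the LLM-Adapters family, injects a trainable low-rank update `B @ A` alongside a frozen pretrained weight `W`, so the adapted layer computes `W x + B A x`. The sketch below is a toy NumPy illustration of that mechanic (toy sizes, not the repository's implementation); all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2  # toy hidden size and LoRA rank

W = rng.normal(size=(d, d))          # frozen pretrained weight (not trained)
A = rng.normal(size=(r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-initialized

x = rng.normal(size=(d,))
y_base = W @ x                       # base model output
y_adapted = W @ x + B @ (A @ x)      # output with the LoRA update added

# Because B starts at zero, the adapted layer initially matches the base layer;
# training then only updates the small A and B matrices.
assert np.allclose(y_base, y_adapted)
```

In practice, attaching LoRA to ChatGLM-6B with the `peft` library would typically target its fused attention projection module (named `query_key_value` in the ChatGLM implementation) via `LoraConfig(target_modules=["query_key_value"])`; the exact module names to target are an assumption to verify against the model's code.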