PhoebusSi / Alpaca-CoT

We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use. We welcome open-source enthusiasts to open any meaningful PR on this repo and integrate as many LLM-related technologies as possible. We have built a fine-tuning platform that makes it easy for researchers to get started with large models, and we welcome open-source enthusiasts to submit any meaningful PR!
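For a concrete sense of what a unified parameter-efficient interface looks like, here is a minimal sketch of attaching a LoRA adapter to a causal LM with the Hugging Face peft library; the base-model name and hyperparameters are illustrative placeholders, not the repo's actual defaults.

```python
# Minimal sketch: wrap a causal LM with a LoRA adapter via peft.
# Model name and hyperparameters are placeholders, not Alpaca-CoT's defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base = "huggyllama/llama-7b"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # rank of the low-rank update
    lora_alpha=16,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the small adapter weights are trainable
```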

After fine-tuning finishes, running ChatGLM throws an error #45

Open StarRanger opened 1 year ago

StarRanger commented 1 year ago

```
Traceback (most recent call last):
  File "/root/llm/Alpaca-CoT-main/app.py", line 15, in <module>
    from model_chatglm import ChatGLMForConditionalGeneration, ChatGLMTokenizer
ModuleNotFoundError: No module named 'model_chatglm'
```

Do I need to `pip install` some package for this?
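For reference, my guess (which may be wrong) is that `model_chatglm` is a module shipped inside the repo rather than a pip-installable package, so `pip install` will not provide it. The stock transformers route for loading ChatGLM-6B straight from the Hub, per its official README, looks like this and can help rule out environment issues:

```python
# Not a fix for the repo's app.py -- just the standard transformers way to
# load ChatGLM-6B from the Hub, useful as a sanity check of the environment.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()

response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```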

PhoebusSi commented 1 year ago

For now, please use generate.py to test ChatGLM; ChatGLM support will be integrated into app.py later.
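Roughly, testing with generate.py amounts to loading the base ChatGLM-6B weights, attaching the fine-tuned adapter, and generating. The sketch below shows that flow with the peft API; the adapter directory is hypothetical, and generate.py's actual command-line arguments may differ, so check the script itself.

```python
# Sketch of the generate-time flow (not generate.py itself): load base
# ChatGLM-6B, attach a fine-tuned adapter with peft, then chat.
# `adapter_dir` is a hypothetical path to the fine-tuning output.
from transformers import AutoTokenizer, AutoModel
from peft import PeftModel

base = "THUDM/chatglm-6b"
adapter_dir = "./saved_chatglm_adapter"   # hypothetical output directory

tokenizer = AutoTokenizer.from_pretrained(base, trust_remote_code=True)
model = AutoModel.from_pretrained(base, trust_remote_code=True).half().cuda()
model = PeftModel.from_pretrained(model, adapter_dir)

response, _ = model.chat(tokenizer, "用一句话介绍LoRA。", history=[])
print(response)
```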

StarRanger commented 1 year ago

The latest P-Tuning from the official ChatGLM-6B repo is quite solid. I tried fine-tuning with it and it works fine, so it would be worth integrating when you have time.

StarRanger commented 1 year ago

https://github.com/THUDM/ChatGLM-6B/blob/main/ptuning/README.md
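The linked README walks through the official ptuning/ scripts. As a rough sketch of the same idea (freeze the LM, train only a small prompt encoder), the peft library's PromptEncoderConfig can also be used; shown here on a small placeholder model, since ChatGLM-6B's custom modeling code may need extra adaptation, and the hyperparameters are illustrative.

```python
# Sketch of P-Tuning via the peft library (not the official ptuning/ scripts
# from the README above). Uses a small placeholder model; values are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PromptEncoderConfig, get_peft_model, TaskType

base = "bigscience/bloomz-560m"  # placeholder; swap in the target model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

config = PromptEncoderConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=20,       # length of the learned soft prompt
    encoder_hidden_size=128,     # hidden size of the prompt-encoder network
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # backbone frozen; only the prompt encoder trains
```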

PhoebusSi commented 1 year ago

P-Tuning integration has been added to the ToDo list, stay tuned~