THUDM / ChatGLM2-6B

ChatGLM2-6B: An Open Bilingual Chat LLM | 开源双语对话语言模型

[BUG/Help] Further fine-tuning on top of an existing P-tuning checkpoint #635

Open fan-xh opened 7 months ago

fan-xh commented 7 months ago

Is there an existing issue for this?

Current Behavior

I have already completed one round of P-tuning. Now I have a new batch of data and would like to fine-tune again on top of the previous round's result.

Expected Behavior

Merge the base model weights with the P-tuned weights so the combined model can serve as the starting point for the next round.

Steps To Reproduce

Following the steps in the README, I was able to complete the first round of fine-tuning, but I don't know how to export the new model weights afterwards (see the sketch below for what I have in mind).
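For reference, a minimal sketch of one possible approach, based on the prefix-encoder loading pattern shown in the repo's ptuning README: load the base model plus the P-tuning checkpoint, then save the combined weights. The checkpoint path, export directory, and `pre_seq_len=128` below are placeholders and must match the first training run; I have not verified this end to end.

```python
import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

BASE_MODEL = "THUDM/chatglm2-6b"
# Hypothetical paths -- replace with your own checkpoint and export locations.
PTUNING_CHECKPOINT = "output/checkpoint-3000"
EXPORT_DIR = "chatglm2-6b-ptuned-export"

# pre_seq_len must match the value used during the first round of P-tuning.
config = AutoConfig.from_pretrained(BASE_MODEL, trust_remote_code=True, pre_seq_len=128)
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, trust_remote_code=True)
model = AutoModel.from_pretrained(BASE_MODEL, config=config, trust_remote_code=True)

# Load only the prefix-encoder weights from the P-tuning checkpoint,
# following the loading code in the ptuning README.
prefix_state_dict = torch.load(
    os.path.join(PTUNING_CHECKPOINT, "pytorch_model.bin"), map_location="cpu"
)
new_prefix_state_dict = {}
for k, v in prefix_state_dict.items():
    if k.startswith("transformer.prefix_encoder."):
        new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

# Save the base model together with the loaded prefix encoder so the
# exported directory can be used as the starting point for the next round.
model.save_pretrained(EXPORT_DIR)
tokenizer.save_pretrained(EXPORT_DIR)
```

My assumption is that the second round could then be launched the same way as the first, pointing `--model_name_or_path` at the exported directory instead of `THUDM/chatglm2-6b`, but I am not sure whether this is the intended workflow.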

Environment

- OS: Ubuntu 22.04
- Python: 3.10
- Transformers: 4.30.2
- PyTorch: 2.0.0+cu118
- CUDA Support: True

Anything else?

No response