THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0

[Help] Does ChatGLM support continual pretraining like LLaMA? #1309

Open zoepo opened 1 year ago

zoepo commented 1 year ago

Is there an existing issue for this?

Current Behavior

I've seen that LLaMA supports continual pretraining. Can ChatGLM be continued-pretrained in the same way?

Expected Behavior

No response

Steps To Reproduce

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

No response

tomcat123a commented 11 months ago

Take a look at https://github.com/shibing624/MedicalGPT — it has ready-made pipelines for pretraining, instruction fine-tuning, reward model training, and PPO.
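
For anyone landing here later: below is a minimal sketch of what continual pretraining of ChatGLM-6B on raw text could look like with the Hugging Face `Trainer`. The corpus file (`corpus.txt`), sequence length, and hyperparameters are placeholders, and full-parameter training of a 6B model will need far more GPU memory than shown here (or DeepSpeed/LoRA); this is an illustration, not an official recipe — see the MedicalGPT repo above for a complete pipeline.

```python
from datasets import load_dataset
from transformers import (
    AutoModel,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Trainer,
    TrainingArguments,
)

model_name = "THUDM/chatglm-6b"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)

# Raw-text corpus; "corpus.txt" is a placeholder path.
raw = load_dataset("text", data_files={"train": "corpus.txt"})["train"]

block_size = 512  # placeholder sequence length

def tokenize(examples):
    # Use the input ids themselves as labels; the model shifts them
    # internally to compute the autoregressive LM loss.
    ids = tokenizer(
        examples["text"], truncation=True, max_length=block_size
    )["input_ids"]
    return {"input_ids": ids, "labels": ids}

train_ds = raw.map(tokenize, batched=True, remove_columns=["text"])

# Pads input_ids with the pad token and labels with -100 so padded
# positions are ignored by the loss.
collator = DataCollatorForSeq2Seq(tokenizer, model=model, label_pad_token_id=-100)

args = TrainingArguments(
    output_dir="chatglm-continual",       # placeholder output dir
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=1e-5,
    num_train_epochs=1,
    fp16=True,
    logging_steps=10,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=collator,
)
trainer.train()
trainer.save_model("chatglm-continual")
```

The key point is simply that the ChatGLM model class accepts `labels` and returns a language-modeling loss, so continued pretraining on raw text is the same loop as any causal LM; what MedicalGPT adds on top is the full data processing, parameter-efficient training, and RLHF stages.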