shibing624 / MedicalGPT

MedicalGPT: Training Your Own Medical GPT Model with ChatGPT Training Pipeline. Trains medical large language models, implementing incremental pretraining (PT), supervised fine-tuning (SFT), RLHF, DPO, and ORPO.

How much GPU memory does ChatGLM2 pretraining need? OOM even on 4*44 GB #266

Closed — xpcc355 closed this issue 10 months ago

xpcc355 commented 10 months ago

Describe the Question

As stated in the title: how much GPU memory does ChatGLM2 pretraining require? It reports OOM even on 4 × 44 GB GPUs.

shibing624 commented 10 months ago

Take a look at the ChatGLM2 documentation.
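
For reference, a common way to avoid OOM when full-parameter pretraining of ChatGLM2-6B does not fit is to load the base model quantized and train only LoRA adapters. The following is a minimal sketch assuming the Hugging Face transformers / peft / bitsandbytes stack; the model name, LoRA hyperparameters, and target module are illustrative assumptions, not the repository's exact settings or script.

```python
# Minimal sketch: load ChatGLM2-6B in 4-bit and attach LoRA adapters so that
# only a small set of adapter weights is trained, greatly reducing GPU memory.
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "THUDM/chatglm2-6b"  # assumed base model from the issue title

# 4-bit quantization config (QLoRA-style) to shrink the frozen base weights.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
)

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModel.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# ChatGLM2 packs q/k/v into a single projection; "query_key_value" is the usual
# LoRA target module for this architecture (an assumption — verify against
# model.named_modules()).
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["query_key_value"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights are trainable
```

With adapters attached, the model can then be passed to a standard causal-LM training loop or `transformers.Trainer`; gradient checkpointing and a smaller per-device batch size with gradient accumulation are further knobs to try if OOM persists.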