OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone

finetune_lora.sh use ds_config_zero3.json error #184

Closed · zhangchaosunshine closed this issue 1 month ago

zhangchaosunshine commented 1 month ago

I want to use ds_config_zero3.json to reduce GPU memory. I only have 4*24G of GPU memory, but an error occurred:

```
File "/home/cce/work/.cache/huggingface/modules/transformers_modules/MiniCPM-Llama3-V-2_5/resampler.py", line 152, in forward
    x + pos_embed,  # L * B * D + L * B * D
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:1 and cpu!
```

Can you provide a finetune_lora.sh for zero3?
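For anyone hitting the same device mismatch before the upstream update: a common workaround pattern (not necessarily the project's actual fix) is to move the positional embedding onto the activations' device before the add in `forward`. The sketch below is illustrative only; `x` and `pos_embed` follow the names in the traceback, everything else is assumed.

```python
# Illustrative sketch only -- not the actual MiniCPM-V fix.
# Under ZeRO-3, a precomputed pos_embed kept as a plain CPU tensor is not
# moved to the GPU automatically, which triggers the "cuda:1 and cpu"
# RuntimeError when it is added to a CUDA tensor.
import torch

def add_pos_embed(x: torch.Tensor, pos_embed: torch.Tensor) -> torch.Tensor:
    # Align device (and dtype) of the positional embedding with the
    # activations before the elementwise add.
    pos_embed = pos_embed.to(device=x.device, dtype=x.dtype)
    return x + pos_embed
```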

shituo123456 commented 1 month ago

So do I.

qyc-98 commented 1 month ago

Hi, we have updated our code. You can try it again.
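One note that may help when retrying: the traceback points at a copy of `resampler.py` cached under `.cache/huggingface/modules/transformers_modules/`, which is where code loaded with `trust_remote_code=True` lives. If a stale cached copy shadows the updated code, clearing it forces a re-fetch on the next load. A minimal sketch, assuming the default Hugging Face cache location (adjust the path if `HF_HOME` is set, as in the traceback above); this is not an official utility:

```python
# Illustrative sketch: remove the cached remote-code copy so the updated
# model code is fetched again on the next from_pretrained(...) call.
import shutil
from pathlib import Path

cached_module = Path.home() / ".cache/huggingface/modules/transformers_modules/MiniCPM-Llama3-V-2_5"
if cached_module.exists():
    shutil.rmtree(cached_module)
    print(f"Removed stale cached module: {cached_module}")
```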

zhangchaosunshine commented 1 month ago

> Hi, we have updated our code. You can try it again.

Thanks a lot. Good work.