OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone
Apache License 2.0

Multi-GPU inference error with the 12B model #261

Closed · SKY072410 closed this 3 weeks ago

SKY072410 commented 3 weeks ago

I tried changing `if False` to `if True` at chat.py#L30-L31 to run multi-GPU inference with OmniLMM-12B, but a large number of warnings appear:

UserWarning: for lm_head.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?) warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta ')

and inference finally fails with a device mismatch error:

RuntimeError: Tensor on device cuda:0 is not on the expected device meta!
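
For reference, below is a minimal sketch (not the project's chat.py) of the usual Hugging Face accelerate pattern for sharding a large model across several GPUs; the checkpoint path is a placeholder. The warning above typically means the model was built on the "meta" device and the weights were then copied with a plain `load_state_dict`, which leaves meta tensors behind and later surfaces as the cuda:0 vs meta mismatch.

```python
# Minimal sketch, assuming an accelerate-based multi-GPU setup.
# Paths and the model class are placeholders; adjust to your checkpoint.
import torch
from accelerate import init_empty_weights, load_checkpoint_and_dispatch
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("/path/to/OmniLMM-12B", trust_remote_code=True)

with init_empty_weights():
    # Parameters created here live on the "meta" device and hold no data yet.
    model = AutoModelForCausalLM.from_config(config, trust_remote_code=True)

# load_checkpoint_and_dispatch materializes the real weights shard by shard and
# places each layer on a concrete GPU, so nothing stays on "meta" afterwards.
# A plain model.load_state_dict(...) on a meta-initialized model would instead
# emit the "copying from a non-meta parameter ... is a no-op" warning.
model = load_checkpoint_and_dispatch(
    model,
    checkpoint="/path/to/OmniLMM-12B",
    device_map="auto",
    dtype=torch.float16,
)
model.eval()
```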

iceflame89 commented 3 weeks ago

Please provide more detailed environment information following the issue template.

Cuiunbo commented 3 weeks ago

We need more context to analyze this issue. Please provide more details following the issue template so that we can reproduce the error and resolve it more quickly.