I tried changing `if False` to `if True` at chat.py#L30-L31 to run OmniLMM-12B multi-GPU inference, but this produced a flood of warnings:

UserWarning: for lm_head.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass assign=True to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '

and finally a device-mismatch error:

RuntimeError: Tensor on device cuda:0 is not on the expected device meta!