OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone

Can inference be accelerated with multiple GPUs? #610

Open · sunyclj opened this issue 1 month ago

sunyclj commented 1 month ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

Single-GPU inference takes about 2 s per image. How can I use multiple GPUs to speed up inference?

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

LDLINGLINGLING commented 1 month ago

Hi, the code for multi-GPU inference with transformers is here: https://modelbest.feishu.cn/wiki/LZxLwp4Lzi29vXklYLFchwN5nCf#share-ZZq8dLhzbosu4MxbdqQcqUa8nzh
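
For reference, here is a minimal sketch of loading MiniCPM-V 2.6 across several GPUs with `transformers` plus `accelerate`'s `device_map="auto"`. The model id, the image path, and the assumption that an automatic device map splits this model cleanly are illustrative, not taken from the linked wiki; the wiki above describes the officially recommended device map, so prefer it if the automatic split misbehaves.

```python
# Sketch: shard MiniCPM-V 2.6 across all visible GPUs (requires `accelerate` installed).
# device_map="auto" mainly reduces per-GPU memory; it does not by itself lower latency.
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-V-2_6"

model = AutoModel.from_pretrained(
    model_id,
    trust_remote_code=True,
    attn_implementation="sdpa",
    torch_dtype=torch.bfloat16,
    device_map="auto",  # let accelerate place layers across the available GPUs
).eval()
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
msgs = [{"role": "user", "content": [image, "What is in this image?"]}]

# chat() call follows the model card's usage example
answer = model.chat(image=None, msgs=msgs, tokenizer=tokenizer)
print(answer)
```

Note that sharding a single model across GPUs will not necessarily cut the ~2 s single-image latency; for throughput, it is usually more effective to run one full replica per GPU and distribute images across them, or to serve the model with vLLM, which the repository lists as supported.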