OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone
Apache License 2.0

finetune on NPUs #320

Closed EasonXiao-888 closed 1 day ago

EasonXiao-888 commented 1 week ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

Hi, thanks for the excellent work. I'd like to ask whether there is official finetuning code adapted for NPUs. I have already tried migrating the code from GPUs to NPUs manually, but I ran into many bugs.
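One common source of bugs when porting GPU training code to NPUs is hard-coded `.cuda()` calls and `"cuda"` device strings. Below is a minimal, hedged sketch of a device-selection helper, assuming the NPUs are Huawei Ascend devices driven by the `torch_npu` plugin; this is illustrative only and not official code from this repo:

```python
import torch


def pick_device() -> torch.device:
    """Return the best available device, preferring an Ascend NPU.

    Assumption: "NPU" means a Huawei Ascend device used through the
    `torch_npu` plugin, which registers the "npu" backend on import.
    """
    try:
        import torch_npu  # noqa: F401  (assumed plugin; side effect registers "npu")
        if torch.npu.is_available():
            return torch.device("npu:0")
    except ImportError:
        pass  # plugin not installed; fall back to CUDA or CPU
    if torch.cuda.is_available():
        return torch.device("cuda:0")
    return torch.device("cpu")


# Use the helper instead of scattering model.cuda() / tensor.cuda() calls,
# so the same script runs unchanged on NPU, GPU, or CPU hosts.
device = pick_device()
```

Routing all placement through one helper like this also makes the remaining NPU-specific failures (unsupported ops, dtype mismatches) easier to isolate, since only the device string changes between hosts.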

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

LDLINGLINGLING commented 4 days ago

Hi, which device are you using? We look forward to your excellent work.

THUCSTHanxu13 commented 1 day ago

You can take a look at our distributed training toolkit BMTrain (https://github.com/OpenBMB/BMTrain). We will soon provide branches that run on NPUs.