OpenBMB / MiniCPM-V

MiniCPM-Llama3-V 2.5: A GPT-4V Level Multimodal LLM on Your Phone

[BUG] Pre-training unstable #309

Open orrzohar opened 1 week ago

orrzohar commented 1 week ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

I am trying to pre-train MiniCPM-Llama3-V 2.5, but pre-training is unstable. I tried the following:

Expected Behavior

Stable pre-training.

Steps To Reproduce

  1. Clone this repo.
  2. Download the image pre-training datasets.
  3. Run pre-training (a generic monitoring sketch follows this list).
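
Not part of this repo, but as a generic way to surface the instability, a minimal transformers `TrainerCallback` sketch like the one below could flag non-finite losses and loss spikes during pre-training; the window size and 3x spike threshold are assumptions:

```python
# Hypothetical diagnostic, not part of MiniCPM-V: flag non-finite losses
# and loss spikes relative to a running mean while the Trainer runs.
import math
from transformers import TrainerCallback

class LossSpikeCallback(TrainerCallback):
    def __init__(self, window=50, spike_factor=3.0):
        self.window = window              # recent losses kept for the running mean
        self.spike_factor = spike_factor  # assumed threshold: 3x the running mean
        self.history = []

    def on_log(self, args, state, control, logs=None, **kwargs):
        loss = (logs or {}).get("loss")
        if loss is None:
            return
        if not math.isfinite(loss):
            # Stop on NaN/Inf so the diverging step is easy to locate.
            print(f"step {state.global_step}: non-finite loss ({loss})")
            control.should_training_stop = True
            return
        if len(self.history) == self.window:
            mean = sum(self.history) / self.window
            if loss > self.spike_factor * mean:
                print(f"step {state.global_step}: loss spike {loss:.3f} "
                      f"(running mean {mean:.3f})")
            self.history.pop(0)
        self.history.append(loss)
```

It can be attached with `trainer.add_callback(LossSpikeCallback())` before calling `trainer.train()`.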

Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):
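
For reference, the fields above can be collected with a short script (standard library plus the packages the template already names):

```python
# Print the environment details requested in this issue template.
import platform
import sys

import torch
import transformers

print("OS:", platform.platform())
print("Python:", sys.version.split()[0])
print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("CUDA:", torch.version.cuda)  # None on CPU-only builds
```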

Anything else?

No response