OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone
Apache License 2.0

[BUG] "Missing required key: general.description" when running a converted model with llama.cpp #656

Closed · friendmine closed this 2 weeks ago

friendmine commented 2 weeks ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

After converting the model from https://huggingface.co/openbmb/MiniCPM-V-2_6/tree/main to GGUF, running it fails with: `key general.description not found in file`, then `terminate called after throwing an instance of 'std::runtime_error'  what(): Missing required key: general.description`, and finally `Aborted (core dumped)`. Dumping the converted GGUF with `./gguf-py/scripts/gguf-dump.py` confirms that this metadata key is indeed absent from the file.
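For illustration, the failure mode boils down to a lookup of a required metadata key that the converted file does not contain. The sketch below is not llama.cpp's actual C++ code, just a minimal Python model of the check that produces the reported message:

```python
# Minimal sketch (not llama.cpp's real loader) of a required-metadata-key
# check: the loader asks for "general.description" and aborts if the GGUF
# file's key-value metadata does not contain it.
def require_key(metadata: dict, key: str) -> str:
    """Return metadata[key], or fail the way the reported error does."""
    if key not in metadata:
        raise RuntimeError(f"Missing required key: {key}")
    return metadata[key]

# A GGUF produced without the MiniCPM-V-specific metadata lacks the key:
meta_without_description = {"general.architecture": "clip"}
try:
    require_key(meta_without_description, "general.description")
except RuntimeError as err:
    print(err)  # Missing required key: general.description
```

Inspecting which keys a GGUF actually carries (as the reporter did with `gguf-dump.py`) is the quickest way to confirm this class of error.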

Expected Behavior

The original model or the conversion process should supply the metadata that llama.cpp expects.

Steps To Reproduce

1. Download the files from https://huggingface.co/openbmb/MiniCPM-V-2_6/tree/main.
2. Convert the model following https://github.com/OpenBMB/llama.cpp/blob/minicpmv-main/examples/llava/README-minicpmv2.6.md.
3. Run the converted GGUF as described in the same guide; the error above appears.
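For context, the conversion in the linked README is a multi-step process, roughly as sketched below. Script names and flags are taken from the OpenBMB llama.cpp fork's README at the time of writing and may differ in your checkout, so verify them against the guide before running:

```shell
# 1. Split the vision components out of the HF checkpoint
#    (script from the OpenBMB fork's examples/llava directory)
python ./examples/llava/minicpmv-surgery.py -m ../MiniCPM-V-2_6

# 2. Convert the image encoder; this is the step that produces
#    mmproj-model-f16.gguf, which must be passed separately at run time
python ./examples/llava/minicpmv-convert-image-encoder-to-gguf.py \
    -m ../MiniCPM-V-2_6 --output-dir ../MiniCPM-V-2_6/

# 3. Convert the language model itself to GGUF
python ./convert_hf_to_gguf.py ../MiniCPM-V-2_6/model
```

The key point for this issue is that the conversion yields two GGUF files, the language model and the `mmproj` vision projector, and both are needed when running.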

Environment

- OS: Ubuntu 22.04.5 LTS
- Python: 3.12.7
- Transformers: 4.46.2
- PyTorch: 2.5.1
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):None

Anything else?

No response

friendmine commented 2 weeks ago

User error: I had not passed mmproj-model-f16.gguf. The -mmproj argument should point to mmproj-model-f16.gguf.
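A corrected invocation would look roughly like the following. The binary name and flags follow the fork's MiniCPM-V README at the time of writing; the paths and the prompt are placeholders:

```shell
# Pass both the language-model GGUF and the vision projector; running
# without --mmproj (or with the wrong file) is what triggered
# "Missing required key: general.description" above.
./llama-minicpmv-cli \
    -m ../MiniCPM-V-2_6/model/ggml-model-f16.gguf \
    --mmproj ../MiniCPM-V-2_6/mmproj-model-f16.gguf \
    --image ./example.jpg \
    -p "What is in the image?"
```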