l3utterfly closed this issue 8 months ago.
I tested with MobileVLM_1.4B-Chat, and the above conversion process fails with the same error.
The latest version of llama.cpp will be released soon. I will inform you once it is ready. 😸
The latest version of the customized llama.cpp is here: https://github.com/XiaotaoChen/llama.cpp/tree/MobileVLM-PEG. While converting the projector for MobileVLM v2, set the --projector-type to peg. If you run into any problems using it, please let us know.
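A sketch of what the projector conversion step might look like with that flag, modeled on the image-encoder conversion command in the llama.cpp llava example; the paths here are placeholders and the exact script arguments should be checked against the linked branch:

```shell
# Hypothetical invocation -- model paths are placeholders, and the flag set
# assumes the convert-image-encoder-to-gguf.py script from examples/llava.
python ./examples/llava/convert-image-encoder-to-gguf.py \
    -m path/to/clip-vit-large-patch14-336 \
    --llava-projector path/to/MobileVLM_V2-1.7B/llava.projector \
    --output-dir path/to/MobileVLM_V2-1.7B \
    --projector-type peg   # "peg" for MobileVLM v2 (v1 uses "ldp")
```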
The same error is reported in this issue: https://github.com/Meituan-AutoML/MobileVLM/issues/14
I followed the exact steps from https://github.com/ggerganov/llama.cpp/blob/master/examples/llava/MobileVLM-README.md. They work for MobileVLM_1.7B, but not for v2.