shibingli opened 2 days ago
For llama-box, I think MiniCPM-V-2_6 is not well supported at present; MiniCPM needs a specific prompt prefix rather than the LLaVA one: https://github.com/ggerganov/llama.cpp/blob/86dc11c5bcf34db2749d8bd8d4fa07a542c94f84/examples/llava/minicpmv-cli.cpp#L217-L241.
Have you tried other models? I'd be glad to see your feedback here.
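To make the prompt-prefix difference concrete, here is a rough sketch comparing a LLaVA-style template with the ChatML-style template MiniCPM-V-2.6 expects (as built in the linked minicpmv-cli.cpp). The exact token strings are an assumption based on my reading of that code, so treat this as illustrative rather than authoritative:

```python
# Illustrative sketch only: token strings are assumed from llama.cpp's
# minicpmv-cli.cpp and the models' chat templates; verify against the source.

def llava_prompt(question: str) -> str:
    # LLaVA v1.5/v1.6-style: plain USER/ASSISTANT turns with an <image> slot.
    return f"USER: <image>\n{question}\nASSISTANT:"

def minicpmv26_prompt(question: str) -> str:
    # MiniCPM-V-2.6: ChatML (Qwen2-style) turns; the image slot is wrapped
    # as (<image>./</image>) rather than a bare <image> tag.
    return (
        "<|im_start|>user\n"
        "(<image>./</image>)\n"
        f"{question}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

if __name__ == "__main__":
    q = "What is in this picture?"
    print(llava_prompt(q))
    print(minicpmv26_prompt(q))
```

Feeding a MiniCPM-V model the LLaVA-style string would explain the poor results, since the model never sees the chat tokens it was trained on.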
Do the vision models only support LLaVA-Phi-3-Mini, or do they also support llava-v1.6-vicuna, llava-v1.6-mistral, llava-v1.5-13b, llava-v1.6-34b, and MiniCPM-V-2_6?