anlogo opened this issue 1 month ago
Same issue.
Same issue.
Error: llama runner process has terminated: GGML_ASSERT(new_clip->has_llava_projector) failed
same problem
The official ollama build is not currently supported. Please compile our fork by following the instructions in our README.
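For reference, building ollama from a source checkout generally follows the upstream development instructions; the authoritative steps for the MiniCPM-V fork are in its README, so the sketch below is illustrative only (the checkout path and binary name are assumptions):

```shell
# Illustrative build steps, assuming a local checkout of the MiniCPM-V ollama fork;
# consult the project README for the authoritative instructions.
cd ollama
go generate ./...   # builds the bundled llama.cpp runners
go build .          # produces the ./ollama binary in the current directory
./ollama serve      # start the locally built server instead of the official one
```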
是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
当前行为 | Current Behavior
Modelfile:

```
FROM ./minicpm/ggml-model-f16.gguf
FROM ./minicpm/mmproj-model-f16.gguf

TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>{{ end }}
{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>{{ end }}
<|im_start|>assistant<|im_end|>
{{ .Response }}<|im_end|>"""

PARAMETER stop "<|endoftext|>"
PARAMETER stop "<|im_end|>"
PARAMETER num_ctx 2048
```
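With a Modelfile like the one above saved locally, the model is registered and started with ollama's standard CLI (the model name here simply matches the one used in this report):

```shell
# Register the model from the Modelfile, then run it
ollama create minicpm2.6 -f Modelfile
ollama run minicpm2.6
```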
```
$ ollama --version
ollama version is 0.3.3
$ ollama run minicpm2.6
```

When I run it, I get the error: "Error: llama runner process has terminated: signal: aborted (core dumped)"
期望行为 | Expected Behavior
No response
复现方法 | Steps To Reproduce
No response
运行环境 | Environment
备注 | Anything else?
No response