OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone
Apache License 2.0

[BUG] Error: llama runner process has terminated: signal: aborted (core dumped) #418

Open anlogo opened 1 month ago

anlogo commented 1 month ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

Modelfile

FROM ./minicpm/ggml-model-f16.gguf
FROM ./minicpm/mmproj-model-f16.gguf

TEMPLATE """{{ if .System }}<|im_start|>system

{{ .System }}<|im_end|>{{ end }}

{{ if .Prompt }}<|im_start|>user

{{ .Prompt }}<|im_end|>{{ end }}

<|im_start|>assistant<|im_end|>

{{ .Response }}<|im_end|>"""

PARAMETER stop "<|endoftext|>"
PARAMETER stop "<|im_end|>"
PARAMETER num_ctx 2048
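For context, a Modelfile like the one above is normally registered with `ollama create` before `ollama run` is invoked; a minimal sketch of those steps, assuming the two GGUF files exist at the paths shown and that the Modelfile is saved in the current directory:

```shell
# Sketch: register the Modelfile under the name used later in this thread.
# Assumes ./minicpm/ggml-model-f16.gguf and ./minicpm/mmproj-model-f16.gguf exist.
ollama create minicpm2.6 -f Modelfile

# Then start a session; the crash reported below occurs at this step.
ollama run minicpm2.6
```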

ollama --version

ollama version is 0.3.3

ollama run minicpm2.6

When I run it, I get the error: "Error: llama runner process has terminated: signal: aborted (core dumped)"

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS: CentOS
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response

aceliuchanghong commented 1 month ago

Same problem here.

aceliuchanghong commented 1 month ago

Same problem here.

Error: llama runner process has terminated: GGML_ASSERT(new_clip->has_llava_projector) failed

XuNing2 commented 3 weeks ago

same problem

tc-mb commented 3 weeks ago

Running this model with the official ollama build is not currently supported. Please refer to the compilation instructions in our README.