-
### Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [X] 2. The bug has not been fixed in the latest version.
### Describe the bug
The official Python example code for running inference with MiniCPM-V-2.5…
-
### Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
### Is there an existing ans…
-
Subscribe to this issue and stay notified about new [weekly trending repos in Python](https://github.com/trending/python?since=weekly)!
-
### What is the issue?
### run hhao/openbmb-minicpm-llama3-v-2_5:fp16
```
msg="error loading llama server" error="llama runner process has terminated: exit status 0xc0000409 "
time=2024-05-29T2…
```
-
### Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
### Is there an existing ans…
-
### Reminder
- [X] I have read the README and searched the existing issues.
### Reproduction
N/A
### Expected behavior
_No response_
### System Info
_No response_
### Others
_No response_
-
I have been running CogVLM2 on TextVQA and got an accuracy of 4.06, while InternVL and other models such as MiniCPM reach roughly 70 or 80. I compared the model output against the ground-truth answers …
-
### Is there an existing issue / discussion for this?
- [X] I have searched the existing issues / discussions
### Is there an existing…
-
Referring to https://github.com/OpenBMB/ollama/issues/3#issuecomment-2260209553
:-)
Thanks @tc-mb!
-