OpenBMB / MiniCPM-V

MiniCPM-V 2.6: A GPT-4V Level MLLM for Single Image, Multi Image and Video on Your Phone
Apache License 2.0

[llamacpp] Llama.cpp HTTP server #457

Open · haoyu-zhao opened 3 months ago

haoyu-zhao commented 3 months ago

Start Date

No response

Implementation PR

No response

Reference Issues

No response

Summary

Hi, I am trying to load this model with the llama.cpp HTTP server, but it fails to return a correct answer when called. Is this method not supported yet? [screenshot attached]

Basic Example

llama.cpp API server; a sketch of the kind of call that fails is shown below.
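For reference, a minimal client sketch of the request being attempted. It assumes the server was launched with the MiniCPM-V 2.6 GGUF weights plus its mmproj file and that the running build accepts image input through the OpenAI-compatible chat endpoint, which is exactly what this issue is asking about; the file names, port, and image path are placeholders.

```python
# Hypothetical sketch: querying a llama.cpp HTTP server started with something like
#   llama-server -m MiniCPM-V-2_6-Q4_K_M.gguf --mmproj mmproj-model-f16.gguf --port 8080
# Model/projector file names and the port are assumptions, not confirmed working setups.
import base64
import requests

# Encode a local test image (path is a placeholder) as a base64 data URL.
with open("demo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

payload = {
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is in this image?"},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
    "max_tokens": 256,
}

# llama.cpp's server exposes an OpenAI-compatible chat completions endpoint.
resp = requests.post(
    "http://localhost:8080/v1/chat/completions", json=payload, timeout=120
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

With the current adaptation state described below, a request like this returns an incorrect answer rather than a proper description of the image.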

Drawbacks

No drawbacks

Unresolved Questions

No response

tc-mb commented 3 months ago

Yes, I haven't had time to adapt this part of llama.cpp yet.

haoyu-zhao commented 3 months ago

If possible, I would greatly appreciate it if you could prioritize fixing this issue. The feature is particularly important for my current work, and having it operational would significantly help my progress. Thank you very much for your understanding and assistance!