Closed cegutica closed 1 month ago
I think that will only happen after llama.cpp merges this fork's code into its main branch; then we can expect llama-cpp-python to add model support on their end.
As far as I can see, it is already supported by llama.cpp: https://github.com/ggerganov/llama.cpp/tree/master#:~:text=model.md)-,Multimodal%20models%3A,-LLaVA%201.5%20models
What about llama-cpp-python?
@all Hi, I don't always pay attention to the issues section of this fork repo.
Prerequisites
Feature Description
Hi, I have tested the MiniCPM-Llama3-V-2_5 model with my own data using the code provided in this repository, and it works really well. I saw you have also uploaded quantized versions of this model here, and I would like to try the model with the llama-cpp-python library. Is there a way to do so currently? If not, it would be very useful if this model could also be run with this library.
Motivation
This library provides a simple way to use models in GGUF format and is very popular in the community.
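For reference, a minimal sketch of how llama-cpp-python typically loads a multimodal GGUF model, using its LLaVA 1.5 chat handler (the handler llama.cpp's current multimodal support is built around). Whether MiniCPM-Llama3-V-2_5 works through this path depends on upstream support; the model and projector file names below are placeholders, not actual release artifacts:

```python
import os

# llama-cpp-python may not be installed; guard the import so the sketch
# degrades gracefully.
try:
    from llama_cpp import Llama
    from llama_cpp.llama_chat_format import Llava15ChatHandler
    HAVE_LLAMA_CPP = True
except ImportError:
    HAVE_LLAMA_CPP = False

# OpenAI-style chat request pairing an image with a text prompt, the schema
# llama-cpp-python's create_chat_completion expects for vision models.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": "file:///path/to/image.png"}},
            {"type": "text", "text": "Describe this image."},
        ],
    }
]

# Placeholder file names — substitute the actual GGUF weights and the
# CLIP/vision projector file shipped with the model.
MODEL_PATH = "MiniCPM-Llama3-V-2_5.gguf"
MMPROJ_PATH = "mmproj.gguf"

if HAVE_LLAMA_CPP and os.path.exists(MODEL_PATH) and os.path.exists(MMPROJ_PATH):
    chat_handler = Llava15ChatHandler(clip_model_path=MMPROJ_PATH)
    llm = Llama(model_path=MODEL_PATH, chat_handler=chat_handler, n_ctx=2048)
    response = llm.create_chat_completion(messages=messages)
    print(response["choices"][0]["message"]["content"])
```

This mirrors how the existing LLaVA 1.5 GGUF models are used with the library today; a new model usually needs both llama.cpp-side support and, if its chat template differs, a matching chat handler in llama-cpp-python.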
Possible Implementation
No response