Mobile-Artificial-Intelligence / maid_llm

maid_llm is a dart implementation of llama.cpp used by the mobile artificial intelligence distribution (maid)
MIT License

Support LLaVA models with llama.cpp #5

Open nonetrix opened 5 months ago

nonetrix commented 5 months ago

It would be neat if this supported LLaVA models, since it seems some 7B RP models are starting to be able to do it.

e.g. https://huggingface.co/Lewdiculous/Eris_PrimeV3-Vision-7B-GGUF-IQ-Imatrix
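For reference, llama.cpp already exposes this through its llava example (clip.h / llava.h). Below is a rough sketch of the extra native-side steps a wrapper would need on top of plain text generation; the `eval_image` helper is hypothetical, and the `llava_*` calls follow the example headers at the time of writing, so exact names and signatures may differ between llama.cpp versions.

```cpp
// Sketch only, assuming llama.cpp's examples/llava API (clip.h / llava.h).
#include "clip.h"
#include "llama.h"
#include "llava.h"

// Hypothetical helper: encode an image and feed it into the llama context
// before the text prompt is evaluated.
bool eval_image(llama_context * ctx_llama, clip_ctx * ctx_clip,
                const char * image_path, int n_threads, int n_batch,
                int * n_past) {
    // Encode the image with the CLIP projector (the separate *mmproj* GGUF,
    // loaded earlier with clip_model_load()).
    llava_image_embed * embed =
        llava_image_embed_make_with_filename(ctx_clip, n_threads, image_path);
    if (!embed) {
        return false;
    }

    // Push the image embedding into the language model, advancing n_past,
    // then continue with normal prompt evaluation and sampling.
    bool ok = llava_eval_image_embed(ctx_llama, embed, n_batch, n_past);
    llava_image_embed_free(embed);
    return ok;
}
```

On the Dart side this would presumably mean binding these extra calls over FFI and letting the prompt API accept an image path plus the mmproj model path alongside the usual text prompt.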