Ratinod opened 3 months ago
Can you add support for .gguf version of MoonDream2? Currently .gguf gives an error with "LLava Clip Loader".
`moondream2-mmproj-f16.gguf`, `moondream2-text-model-f16.gguf`
The specialized MoonDream2 node gives me a CUDA error (is 8 GB of VRAM not enough?).
They added separate classes to llama.cpp for nanoLLaVA and MoonDream; I will add support for them.