mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed, P2P inference
https://localai.io
MIT License

Issues running a few Models. #4231

Open michieal opened 2 days ago

michieal commented 2 days ago

LocalAI version:

2.30.0

Environment, CPU architecture, OS, and Version:

6.11.0-9-generic #9-Ubuntu SMP PREEMPT_DYNAMIC Mon Oct 14 13:19:59 UTC 2024 x86_64 GNU/Linux (Kubuntu 24.10, KDE Plasma). AMD Radeon video card, but I don't have ROCm installed, mainly because this version doesn't work with AMDGPU-dkms and it breaks my graphics.

Describe the bug

My main issue is that the Phi models just don't work at all (the vision-capable ones, like phi-3.5-vision:vllm or openvino-phi3). Additionally, the image generation models are all marked "GPU", and since I am trying to run on CPU, those don't work either.

To Reproduce

Use the web interface. Install, say, phi-3.5-vision:vllm and try to use it. Note the lack of a "processing" indicator.
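To surface any backend error outside the web UI, the same request could be sent directly to LocalAI's OpenAI-compatible endpoint. This is only a sketch: the port, model name, and image URL are assumptions based on LocalAI's defaults and the model name above.

```shell
# Assumes LocalAI is listening on its default port 8080 and the model
# was installed under the name "phi-3.5-vision:vllm"; the image URL is
# a placeholder. The JSON body follows the OpenAI vision message format.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "phi-3.5-vision:vllm",
    "messages": [{
      "role": "user",
      "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": "https://example.com/test.jpg"}}
      ]
    }]
  }'
```

If the backend is failing silently, this should at least return an HTTP error body instead of a missing "processing" indicator.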

Expected behavior

I expected some kind of output... even if it was a descriptive error message.

Logs

I don't know how to collect these — pointers welcome.
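For reference, a sketch of how verbose logs could be collected before reproducing, assuming the DEBUG environment variable and `local-ai` binary name from the LocalAI documentation; the exact container name for a Docker install is an assumption.

```shell
# Run LocalAI with debug logging enabled (per the LocalAI docs),
# then reproduce the request and capture stdout/stderr.
DEBUG=true local-ai run

# For a Docker-based install, the container output carries the same logs:
docker logs -f <localai-container-name>
```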

Additional context

I was really looking forward to working with images, both analysis and generation. My video card doesn't have enough VRAM to hold the model's data, so even if I did have ROCm installed, it would just return an out-of-memory error. I'm more than willing to wait for image generation, since it doesn't have GPU assistance. I can say that poppy_porpoise-v1.4-l3-8b-iq-imatrix, however, works just fine at understanding images, so at least I have a model with "vision"...

Additionally, I apologize if this is a duplicate issue; as a programmer, I know what it's like to have duplicate issues on your project repo. TIA!