-
### Is your feature request related to a problem? Please describe.
"The problem" I encounter is that the files like images which created are created by root.
When mounting the output to the host,…
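A common workaround sketch for this class of problem (not official LocalAI guidance; the container output path shown here is an assumption) is to run the container as the host user, so files written into the bind mount are owned by that user rather than by root:

```shell
# Run the container with the host user's UID/GID so files created in the
# mounted directory are owned by the invoking user, not root.
# NOTE: the in-container path /build/generated is a placeholder; adjust it
# to wherever your LocalAI setup actually writes its output.
docker run -it --rm \
  --user "$(id -u):$(id -g)" \
  -v "$PWD/generated:/build/generated" \
  localai/localai:v2.17.1-cublas-cuda12
```

Whether this works out of the box depends on whether the image expects to run as root internally; if it does, the alternative is to `chown` the output directory after the fact.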
-
**LocalAI version:**
localai/localai:v2.17.1-cublas-cuda12
**Environment, CPU architecture, OS, and Version:**
Linux sphinx 6.5.0-28-generic #29~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Thu Apr 4 …
-
Tracker for: https://github.com/ggerganov/llama.cpp/discussions/5138 and also ROCm
- [x] Vulkan: https://github.com/mudler/LocalAI/pull/2648 (upstream https://github.com/ggerganov/llama.cpp/pull/20…
-
Tools/functions should be supported in streaming mode. Currently, they work only in sync mode.
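To illustrate the gap, here is a minimal sketch of the kind of OpenAI-style request involved: a chat completion that sets both `stream` and `tools` in the same payload. The model name and the `get_weather` tool are placeholders, not part of any actual LocalAI configuration.

```python
import json

# Hypothetical request body combining streaming with tool/function calling.
# Today this combination only works with stream disabled ("sync mode").
payload = {
    "model": "my-model",  # placeholder model name
    "stream": True,       # request SSE-chunked responses
    "messages": [
        {"role": "user", "content": "What is the weather in Rome?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

print(json.dumps(payload, indent=2))
```

Supporting this would mean emitting tool-call deltas in the streamed chunks instead of only returning them in a single synchronous response.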
-
There are many parts of the WebUI that can be improved. I'm creating a tracker here to collect thoughts and areas that need improvement, for instance:
- [x] model card description: (…
-
**LocalAI version:**
[localai/localai:v2.20.1-cublas-cuda12](https://hub.docker.com/layers/localai/localai/v2.20.1-cublas-cuda12/images/sha256-76e5637be78c1904046ef99f708dbefaa068b90a27cb335587…
-
Just thinking aloud.
How easy would it be to separate the backend functionality and abstract it away into a hosting tool like LocalAI?
I have been investigating methods for consolidating re…
-
**Is your feature request related to a problem? Please describe.**
Llama 3.2 was released, and since it has multimodal support it would be great to have it in LocalAI.
**Describe the solution you'd li…
-
**LocalAI version:**
I am using the docker image from dockerhub `localai/localai:lastest-cp` but I don't think it matters.
**Environment, CPU architecture, OS, and Version:**
`Linux archserve…
-
**LocalAI version:** I used the script installer just now
**Environment, CPU architecture, OS, and Version:**
`Linux nacia 6.8.0-45-generic #45~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Wed Sep 11 …