-
Hello, I personally find LocalAI very nice, but I wish to compare its efficiency to llama (and derivatives).
Is a Windows version (with a CPU-only option) going to be available soon?
-
It would be amazing if there were a way to incorporate self-hosted llama. Giving users the ability to use Ollama would let us really sculpt Activepieces AI responses to our liking.
-
Tracker for: https://github.com/ggerganov/llama.cpp/discussions/5138 and also ROCm
- [x] Vulkan: https://github.com/mudler/LocalAI/pull/2648 (upstream https://github.com/ggerganov/llama.cpp/pull/20…
-
This is a tracker for a small overhaul in terms of UX of LocalAI. This includes logging, debugging, error messages, and user interaction, including model galleries. There are at least enough issues to tr…
-
### Which version of assistant are you using?
latest
### Which version of Nextcloud are you using?
v28.0.4
### Which browser are you using? In case you are using the phone App, specify the…
-
**LocalAI version:**
v2.15.0 (f69de3be0d274a676f1d1cd302dc4699f1b5aaf0)
Downloaded CLI local-ai-git-Darwin-arm64
Also tried docker image
**Environment, CPU architecture, OS, and Version:**…
-
**LocalAI version:**
v2.13.0
**Environment, CPU architecture, OS, and Version:**
Intel Xeon E5-2643 v4, GNU/Linux (Unraid 6.12.10)
**Describe the bug**
Templates fail to load when located…
-
### Which version of integration_openai are you using?
1.0.13
### Which version of Nextcloud are you using?
27.1.3
### Which browser are you using? In case you are using the phone App, specify the…
-
Hi, thanks again.
NISQA has been unmaintained for a long time, and this PR could be interesting to help install it on modern Python:
https://github.com/gabrielmittag/NISQA/pull/47
Like
https://librosa.org/d…
-
**LocalAI version:** 2.16.0
**Environment, CPU architecture, OS, and Version:**
mac studio M2 Ultra
**Describe the bug**
using the transformers backend for glm4, `trust_remote_code: true` not c…
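For context on the report above: LocalAI model definitions are typically YAML files that select a backend and pass model parameters. A minimal sketch of such a definition is below — the model name and file layout are placeholder assumptions, and whether the `trust_remote_code` option is actually honored by the transformers backend is exactly what this report questions:

```yaml
# Hypothetical LocalAI model definition (e.g. models/glm4.yaml).
# Names and paths are placeholders, not taken from the report.
name: glm4
backend: transformers
parameters:
  model: THUDM/glm-4-9b-chat   # assumed Hugging Face model ID
# The option under discussion — reportedly not applied by the backend:
trust_remote_code: true
```

The intent is that `trust_remote_code: true` be forwarded to the underlying Hugging Face `transformers` loader, which requires it for models that ship custom Python code (as GLM-4 does).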