-
**LocalAI version:**
v1.25.0
**Environment, CPU architecture, OS, and Version:**
CPU aarch64, Ubuntu 20.04
**Describe the bug**
While trying to run BUILD_GRPC_FOR_BACKEND_LLAMA=ON make bu…
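A minimal sketch of the kind of invocation described above. The exact make target is truncated in the report, so `make build` here is an assumption; the environment variable name is taken verbatim from the issue.

```shell
# Build LocalAI from source with gRPC enabled for the llama backend.
# NOTE: `make build` is an assumed target -- the original command is truncated.
BUILD_GRPC_FOR_BACKEND_LLAMA=ON make build
```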
-
### Checklist
- [X] I've searched for similar issues and couldn't find anything matching
- [X] I've included steps to reproduce the behavior
### Affected Components
- [ ] K8sGPT (CLI)
- [X] K8sGPT …
-
**LocalAI version:**
LocalAI commit [2addb9f](https://github.com/mudler/LocalAI/commit/2addb9f99a29a5131d2e8c0b841dfff334f9b161)
**Environment, CPU architecture, OS, and Version:**
Darwin Mac…
-
Hi,
I'm using v2.17.1 on my Linux server, and when I open the WebUI at /browser I see more than 19k filter tags, which makes the page unusable for a while.
Best
Frank
-
**LocalAI version: 1.22.0**
**Environment, CPU architecture, OS, and Version:**
Linux namehere 5.19.0-46-generic #47~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Wed Jun 21 15:35:31 UTC 2 x86_64 x86_6…
-
I used the .env from LocalAI, compiled with Docker, and it seemed to be working.
-
**LocalAI version:**
the master branch, cloned yesterday
**Environment, CPU architecture, OS, and Version:**
Host server ubuntu 22.04, Linux office 6.5.0-17-generic #17~22.04.1-Ubuntu SMP PREEMPT_DYNA…
-
- AUR Package: https://aur.archlinux.org/packages?K=localai-git
- Github Sources: https://github.com/wuxxin/aur-packages/blob/main/localai-git/PKGBUILD
This version is built for CPU, CUDA, and ROCm…
-
I believe that, in order to resolve https://github.com/mudler/LocalAI/pull/1446, go-llama.cpp needs to be built against at least version 799a1cb13b0b1b560ab0ceff485caed68faa8f1f of llama.cpp to enable mixt…
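A hedged sketch of what pinning the bundled llama.cpp to that commit might look like before rebuilding go-llama.cpp. The directory layout is an assumption; only the commit hash comes from the issue.

```shell
# Pin the vendored llama.cpp checkout to (at least) the commit noted above,
# then rebuild go-llama.cpp against it. Paths are illustrative assumptions.
cd go-llama.cpp/llama.cpp
git fetch origin
git checkout 799a1cb13b0b1b560ab0ceff485caed68faa8f1f
cd ..
make clean && make
```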
-
Ollama currently doesn't support [Open AI Compatible Function Calling](https://github.com/ollama/ollama/issues/2790), but there are models such as [Hermes 2 Pro](https://huggingface.co/NousResearch/Herme…
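For context, an OpenAI-compatible function-calling request has the shape sketched below. The model name and the weather function are illustrative assumptions, not taken from the issue; the `tools`/`tool_choice` fields are the OpenAI chat-completions schema.

```python
import json

def build_tool_call_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat-completion body with one tool defined.

    The function schema here (get_current_weather) is a hypothetical example.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        # Let the model decide whether to call the tool.
        "tool_choice": "auto",
    }

body = build_tool_call_request("hermes-2-pro", "What's the weather in Paris?")
print(json.dumps(body, indent=2))
```

A server exposing the OpenAI-compatible API would return a `tool_calls` entry in the assistant message when the model decides to invoke the function.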