-
Mistral (known for their [7B model](https://mistral.ai/news/announcing-mistral-7b/) and more recently their [Mixture of Experts model](https://mistral.ai/news/mixtral-of-experts/)) have recently start…
-
Hi :wave: !
Very nice project! Any plans/interest in running local models with something like LocalAI? https://github.com/go-skynet/LocalAI
I'd be happy to take a stab at it if there is intere…
-
### Feature request
Integration with LocalAI and its extended endpoints for downloading models from the gallery.
### Motivation
LocalAI is a self-hosted OpenAI drop-in replacement with support for…
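Because LocalAI is an OpenAI drop-in replacement, integration mostly amounts to pointing OpenAI-style requests at a different base URL. A minimal stdlib-only sketch (the port 8080 default and the `gpt-4` model alias are assumptions about a typical LocalAI setup):

```python
import json
from urllib import request

# Assumed: LocalAI listening on its default port 8080.
LOCALAI_BASE_URL = "http://localhost:8080/v1"

def chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request (constructed, not sent)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = chat_request(LOCALAI_BASE_URL, "gpt-4", "Hello!")
print(req.full_url)  # http://localhost:8080/v1/chat/completions
```

Sending the request with `request.urlopen(req)` against a running LocalAI instance would then return the usual OpenAI-shaped JSON response.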
-
### Self Checks
- [X] This is only for bug report, if you would like to ask a question, please head to [Discussions](https://github.com/langgenius/dify/discussions/categories/general).
- [X] I have s…
-
Presently it is very hard to get a Docker container to build with the ROCm backend; several elements seem to fail independently during the build process.
There are other related projects with functiona…
-
I was building `localai-git` with `_ENABLE_CUDA=0 _ENABLE_ROCM=1 _ENABLE_CPU=1`.
After the build finished, `namcap` reported that the following packages are missing these dependencies:
* `localai-…
-
`docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu`
`curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{ "model": "gpt-4", "messages"…
Zibri updated 5 months ago
-
Right now `/v1` is part of ENDPOINT.
But it is [more common to have a BASE_URL](https://github.com/openai/openai-python?tab=readme-ov-file#configuring-the-http-client) (or API_BASE) setting, which incl…
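The practical difference can be sketched as follows: with a BASE_URL scheme, `/v1` lives in the base and each per-call path is appended to it, so one setting covers every endpoint (the `url_for` helper below is illustrative, not part of any existing codebase):

```python
# Assumed base URL for a locally running OpenAI-compatible server.
BASE_URL = "http://localhost:8080/v1"

def url_for(base_url: str, path: str) -> str:
    """Join an API base URL and a per-call path, tolerating stray slashes."""
    return base_url.rstrip("/") + "/" + path.lstrip("/")

print(url_for(BASE_URL, "/chat/completions"))
# http://localhost:8080/v1/chat/completions
print(url_for(BASE_URL, "models"))
# http://localhost:8080/v1/models
```

This matches how the official `openai` client treats its `base_url` option: the same base serves chat, completions, models, and any other route.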
-
**Describe the bug**
I've recently upgraded to release v0.3, running Home Assistant 2024.6.1 in a Docker container. I have reconfigured my models as per the documentation and reading through the so…
-
### Feature/Improvement Description
I've been using LocalAI (as an OpenAI replacement) to serve multiple local models for inference and came across the idea of BNF grammars in llama.cpp. I recall so…
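For reference, llama.cpp expresses these grammars in its GBNF format; a small illustrative grammar that constrains model output to a bare yes/no answer looks roughly like this (a sketch of the syntax, not taken from any project config):

```
root   ::= answer
answer ::= "yes" | "no"
```

Supporting a field like this per-request would let clients constrain generation to machine-parseable output, which is the main appeal of the feature.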