-
Love this plugin so far! Streaming works well with Ollama, but I'd love to be able to use it with LM Studio so that streaming also works with local LLMs on Windows. I have found LM Studio server setup to…
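For context on what the request amounts to: LM Studio's local server speaks the OpenAI chat-completions protocol (on port 1234 by default), so streaming can be sketched with a plain SSE reader. This is a minimal stdlib-only sketch; the port and the `"local-model"` name are assumptions, since LM Studio serves whatever model is loaded.

```python
# Sketch: streaming chat completions from LM Studio's OpenAI-compatible
# local server. Assumes the default port 1234; "local-model" is a
# placeholder name. Stdlib only, no openai package required.
import json
import urllib.request

def build_payload(prompt: str, model: str = "local-model") -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # server replies with SSE "data: {...}" lines
    }

def stream_chat(prompt: str, base_url: str = "http://localhost:1234/v1") -> str:
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    text = []
    with urllib.request.urlopen(req) as resp:
        for raw in resp:  # HTTPResponse iterates line by line
            line = raw.decode().strip()
            if not line.startswith("data: ") or line == "data: [DONE]":
                continue
            delta = json.loads(line[6:])["choices"][0]["delta"].get("content") or ""
            print(delta, end="", flush=True)
            text.append(delta)
    return "".join(text)
```

Since Ollama streaming already works in the plugin, supporting LM Studio would mostly mean making the base URL configurable.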
ckep1 updated
5 months ago
-
- [x] go-llama.cpp
- [ ] gpt2
- [ ] gpt4all
- [ ] rwkv (?)
-
Hi! I connected a model from localai.io to dialoqbase, but the model's responses come back truncated in the chat.
Thanks!
[App] provider local
[App] modelName …
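One common cause worth ruling out here: with OpenAI-compatible backends such as LocalAI, a reply that stops mid-sentence usually means the completion token budget ran out rather than a chat bug. A hedged sketch of how a client could detect this and retry with a larger `max_tokens` (field names follow the OpenAI response shape):

```python
# Sketch: detect token-budget truncation in an OpenAI-style response.
# finish_reason == "length" means the model hit max_tokens, not a
# natural stop; in the chat UI this shows up as a cut-off answer.

def is_truncated(response: dict) -> bool:
    return response["choices"][0].get("finish_reason") == "length"

def with_larger_budget(payload: dict, max_tokens: int = 2048) -> dict:
    # Return a copy of the request with a bigger completion budget.
    return {**payload, "max_tokens": max_tokens}
```

If `finish_reason` is `"stop"` and the text is still cut off, the truncation is happening elsewhere (e.g. in the bridge between LocalAI and dialoqbase).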
phpia updated
6 months ago
-
I would like to request (if not already planned) support for Mixtral from MistralAI:
https://docs.mistral.ai/models/
Thank you
muka updated
6 months ago
-
I have already deployed LocalAI with an OpenAI-compatible interface. How can I use Zep with it?
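The general pattern for pointing any OpenAI-style client at a LocalAI deployment is overriding the base URL. A minimal sketch of that pattern follows; the `OPENAI_API_BASE` convention and LocalAI's default port 8080 are assumptions here, and Zep's exact configuration keys may differ, so its deployment docs should be checked.

```python
# Sketch: redirect OpenAI-style traffic to a LocalAI deployment via the
# environment. Variable names follow the common OPENAI_* convention;
# Zep's actual settings may use different names.
import os

os.environ["OPENAI_API_BASE"] = "http://localhost:8080/v1"  # LocalAI default port
os.environ["OPENAI_API_KEY"] = "sk-local"  # LocalAI ignores the key by default

def resolve_base_url(default: str = "https://api.openai.com/v1") -> str:
    # Clients that honor OPENAI_API_BASE route to LocalAI instead of api.openai.com.
    return os.environ.get("OPENAI_API_BASE", default)
```
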
-
Hi! Awesome project :)
Any plans to support [TTS by OpenAI](https://platform.openai.com/docs/guides/text-to-speech) as well?
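For reference on the scope of the request, this is roughly the call such support would wrap: OpenAI's speech endpoint is `POST /v1/audio/speech` and returns raw audio bytes. A stdlib sketch of building that request, using the documented `tts-1` model and the stock `alloy` voice:

```python
# Sketch: build a request against OpenAI's text-to-speech endpoint
# (POST /v1/audio/speech). The response body is raw audio in the
# requested format; the caller would write it to a file or play it.
import json
import urllib.request

def tts_request(text: str, api_key: str) -> urllib.request.Request:
    body = {
        "model": "tts-1",
        "voice": "alloy",            # other stock voices exist
        "input": text,
        "response_format": "mp3",    # also: opus, aac, flac
    }
    return urllib.request.Request(
        "https://api.openai.com/v1/audio/speech",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```
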
-
https://github.com/Mintplex-Labs/anything-llm/blob/5ad8a5f2d0c545a913319a641877d70aa7b82b09/server/utils/EmbeddingEngines/localAi/index.js#L16
Why is this not configurable using `EMBEDDING_MODEL_CH…
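What's being asked for is the standard configurable-with-fallback pattern: read the model name from the environment, defaulting to the current hard-coded value. A sketch in Python (the linked file is JavaScript, but the pattern is the same); the variable name `EMBEDDING_MODEL` and the default are placeholders, since the project's actual names are truncated above.

```python
# Sketch: make a hard-coded embedding model name overridable via the
# environment. "EMBEDDING_MODEL" and "default-embedder" are placeholders
# for the project's real variable name and current hard-coded value.
import os

def embedding_model(default: str = "default-embedder") -> str:
    return os.environ.get("EMBEDDING_MODEL", default)
```
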
-
**LocalAI version:**
commit 67966b623cd92602406057ce4214577e0a00197d
**Environment, CPU architecture, OS, and Version:**
```bash
AMD Ryzen 7 7840HS with Radeon 780M Graphics
Linux cappuccin…
-
Hi!
The BionicGPT docs here https://bionic-gpt.com/docs/administration/external-api/
mention that I can remove the base LLM (llama-7b) to avoid downloading it.
So I've deleted the following f…