-
Hello,
I have a localai-2.0.0 instance running, but I cannot connect to it through extended_openai_conversation.
I systematically get a connection error.
Everything seems to be configured correctly.
C…
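A quick way to rule out a plain networking problem is to query LocalAI's OpenAI-compatible `/v1/models` endpoint directly from the Home Assistant host. Below is a minimal sketch in Go, assuming the LocalAI instance listens on `http://localhost:8080` (adjust the base URL to your setup):
```go
// Minimal connectivity check against a LocalAI instance.
// Assumption: LocalAI listens on http://localhost:8080.
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	baseURL := "http://localhost:8080" // adjust for your deployment
	client := &http.Client{Timeout: 10 * time.Second}

	// LocalAI exposes an OpenAI-compatible API, so /v1/models should
	// list the installed models if the server is reachable.
	resp, err := client.Get(baseURL + "/v1/models")
	if err != nil {
		fmt.Println("connection failed:", err)
		return
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println("status:", resp.Status)
	fmt.Println(string(body))
}
```
If this request also fails from the same host, the problem is likely between the two machines (address, port, firewall, Docker networking) rather than in the integration's configuration.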
-
Hello - I'm unable to get this integration to work with a LocalAI server. The server itself works just fine, but as soon as I try to call it from Home Assistant with this integration I get the followi…
-
**LocalAI version:**
[Commit 2bacd0180d409b2b8f5c6f1b1ef13ccfda108c48](https://github.com/go-skynet/LocalAI/commit/2bacd0180d409b2b8f5c6f1b1ef13ccfda108c48)
**Environment, CPU architecture, OS…
-
I am trying to build a context-aware chat that works with both OpenAI and LocalAI (https://github.com/mudler/LocalAI).
I have this code:
```go
session, err := InitializeNewChatW…
```
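The snippet above is cut off; as one rough illustration of targeting either back end with the same chat code, the sketch below uses the go-openai client (an assumption, the issue may use a different library) and swaps only the base URL, while keeping the full message history so the conversation stays context-aware:
```go
// Sketch: a chat session that keeps its message history and can talk to
// either OpenAI or LocalAI via the go-openai client (assumption).
package main

import (
	"context"
	"fmt"

	openai "github.com/sashabaranov/go-openai"
)

type ChatSession struct {
	client  *openai.Client
	model   string
	history []openai.ChatCompletionMessage // running conversation context
}

// NewChatSession is a hypothetical helper: pass baseURL="" for OpenAI,
// or e.g. "http://localhost:8080/v1" for a LocalAI instance.
func NewChatSession(apiKey, baseURL, model string) *ChatSession {
	cfg := openai.DefaultConfig(apiKey)
	if baseURL != "" {
		cfg.BaseURL = baseURL // LocalAI speaks the OpenAI wire format
	}
	return &ChatSession{client: openai.NewClientWithConfig(cfg), model: model}
}

func (s *ChatSession) Send(ctx context.Context, text string) (string, error) {
	s.history = append(s.history, openai.ChatCompletionMessage{
		Role: openai.ChatMessageRoleUser, Content: text,
	})
	resp, err := s.client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
		Model:    s.model,
		Messages: s.history, // send the whole history for context awareness
	})
	if err != nil {
		return "", err
	}
	reply := resp.Choices[0].Message
	s.history = append(s.history, reply)
	return reply.Content, nil
}

func main() {
	// Assumed model name; use whatever model is installed in LocalAI.
	session := NewChatSession("not-needed-for-localai", "http://localhost:8080/v1", "gpt-3.5-turbo")
	answer, err := session.Send(context.Background(), "Hello, who are you?")
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println(answer)
}
```
With this shape, switching between OpenAI and LocalAI is just a matter of which base URL and model name are passed to the (hypothetical) NewChatSession helper.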
-
I installed everything step by step.
I tried a separate container, but the same thing happens there.
I get the following message when running autogpt4all.py or the .sh script:
```bash
root@d2c36eb3a44c:/home/autogpt4all# …
```
-
It would be amazing if there were a way to incorporate self-hosted Llama models. Giving users the ability to use Ollama would let us really shape activepieces' AI responses to our liking.
-
**LocalAI version:**
v2.15.0 (f69de3be0d274a676f1d1cd302dc4699f1b5aaf0)
Downloaded CLI local-ai-git-Darwin-arm64
Also tried the Docker image
**Environment, CPU architecture, OS, and Version:**…
-
### Describe the feature you'd like to request
Currently, this integration app only supports configuring a single AI service provider. This limits users who want to utilize different APIs for specifi…
-
### Describe the feature you'd like to request
Currently you need to deselect features like image generation if the endpoint in your LocalAI is not available, because otherwise users are able to try …
-
### Feature request
Adding an OpenAI URL setting would allow integration with https://localai.io/, a locally run API server that is compatible with the OpenAI API specification.
### Why?
Running the AI server …
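A rough sketch of what such a setting could look like (all names here are hypothetical, not taken from the project): the base URL defaults to the official OpenAI endpoint and can be overridden to point at a LocalAI instance, since LocalAI implements the same API surface.
```go
// Hypothetical settings sketch: an overridable OpenAI-compatible base URL.
package main

import "fmt"

type AISettings struct {
	APIKey  string
	BaseURL string // OpenAI by default; point at LocalAI to run locally
}

// NewAISettings defaults to the official OpenAI endpoint when no URL is given.
func NewAISettings(apiKey, baseURL string) AISettings {
	if baseURL == "" {
		baseURL = "https://api.openai.com/v1"
	}
	return AISettings{APIKey: apiKey, BaseURL: baseURL}
}

func main() {
	hosted := NewAISettings("sk-...", "")
	local := NewAISettings("unused", "http://localhost:8080/v1") // LocalAI instance
	fmt.Println(hosted.BaseURL, local.BaseURL)
}
```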