[Open] quinncomendant opened this issue 1 month ago
Same request for other LLM APIs, or for any OpenAI-compatible API.
Ollama already supports those models. Enchanted lets you query the Ollama API, and Ollama itself can either run a model locally (Llama 8B, ...) or query external providers (Anthropic, OpenAI, ...).
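For context on what "querying the Ollama API" looks like: Ollama's local server also exposes an OpenAI-compatible chat endpoint on its default port, so a single request builder can target either the local server or a hosted OpenAI-style API. A minimal sketch, not Enchanted's actual code — the base URLs, model names, and key handling here are assumptions for illustration:

```python
import json

def build_chat_request(base_url, model, messages, api_key=None):
    """Return (url, headers, body) for an OpenAI-compatible chat call.

    Works for a local Ollama server and for hosted providers that speak
    the same chat-completions request shape.
    """
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:  # local Ollama needs no key; hosted APIs use a bearer token
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Local Ollama (default port 11434) vs. a hosted OpenAI-compatible API:
local = build_chat_request("http://localhost:11434", "llama3:8b",
                           [{"role": "user", "content": "Hi"}])
remote = build_chat_request("https://api.openai.com", "gpt-4o",
                            [{"role": "user", "content": "Hi"}],
                            api_key="YOUR_KEY")
print(local[0])   # http://localhost:11434/v1/chat/completions
print(remote[0])  # https://api.openai.com/v1/chat/completions
```

The point is that the request shape is identical; only the base URL and credentials differ, which is why a client that already speaks to Ollama is close to supporting other OpenAI-compatible endpoints.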
@bhenriq, can you share a link or example of how to use Ollama to query online models? I searched and didn't find any mention of using Ollama as a proxy for remote models. In fact, its docs say, “Ollama runs locally, and conversation data does not leave your machine.”
@quinncomendant I have double-checked, and it looks like my answer was incorrect. My apologies for that. Maybe this is indeed a capability that Enchanted could provide itself.
My confusion came from the fact that I am using another AI client named aichat. That client can invoke both local models (via Ollama) and third-party ones — very much what you are asking for. But that client is desktop-only.
I'd like to use Enchanted with Anthropic’s chat API. I noticed there is already some discussion of supporting the APIs for ChatGPT (#139) and Mistral (#25). I think it would be simple and very useful to support additional chat APIs for those of us who like to experiment with different models. 🙏
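For reference, Anthropic's Messages API uses a different request shape than OpenAI-style chat endpoints — an `x-api-key` header instead of a bearer token, a required `anthropic-version` header, a `/v1/messages` path, and a mandatory `max_tokens` field — which is part of why each provider needs explicit support in the app. A rough illustrative sketch (the model name and key are placeholders, and this is not Enchanted code):

```python
import json

def build_anthropic_request(model, messages, api_key, max_tokens=1024):
    """Return (url, headers, body) for an Anthropic Messages API call."""
    url = "https://api.anthropic.com/v1/messages"
    headers = {
        "x-api-key": api_key,               # not "Authorization: Bearer ..."
        "anthropic-version": "2023-06-01",  # required versioning header
        "content-type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "max_tokens": max_tokens,           # required by this API
        "messages": messages,
    })
    return url, headers, body

url, headers, body = build_anthropic_request(
    "claude-3-5-sonnet-latest",  # example model name
    [{"role": "user", "content": "Hello"}],
    api_key="YOUR_KEY",
)
print(url)  # https://api.anthropic.com/v1/messages
```

Supporting this alongside Ollama would mean abstracting over these per-provider differences in auth headers, paths, and required fields.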
Enchanted looks like the best open-source app for general LLM chat support. Looking forward to seeing it evolve!