-
Hello!
First I just wanted to say thank you for this, I saw your post on Reddit and this looks like such a fun project!
Looking through the code, it seems modular enough that it shouldn't be too…
-
Ollama is OpenAI-compatible, so it should work out of the box with the OpenAI provider (by overriding the base URL).
We should explicitly test and document it to ensure there are no unexpected diff…
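As a sketch of the idea: Ollama serves an OpenAI-compatible API under `/v1` on its default port, so "overriding the base URL" mostly means aiming an OpenAI-style chat-completions request at that endpoint. The model tag and prompt below are illustrative assumptions, and the request is only built, not sent:

```python
import json
import urllib.request

# Ollama exposes an OpenAI-compatible API under /v1 (default port 11434).
OLLAMA_BASE_URL = "http://localhost:11434/v1"  # override of the usual OpenAI base URL


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at a local Ollama."""
    payload = {
        "model": model,  # an Ollama model tag, e.g. "llama3" (assumption)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the API key, but OpenAI clients require one to be set.
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )


req = build_chat_request("llama3", "Say hello")
print(req.full_url)
```

With an actual OpenAI SDK client the same override is typically a `base_url` (plus dummy API key) constructor argument rather than a hand-built request.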
-
### Issue
I am receiving the following error when attempting to use my local Ollama instance running qwen2.5-coder-32b-instruct (Ubuntu 24.04). When starting aider and pointing it to my ollama instance with…
-
Currently, the Ollama client does not support tools (cf. https://github.com/lmos-ai/arc/blob/main/arc-ollama-client/src/main/kotlin/OllamaClient.kt#L94). Ollama itself, in principle, supports tools, n…
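For reference, Ollama's native `/api/chat` endpoint accepts an OpenAI-style `tools` array of function schemas, so in principle a client can pass tool definitions through. A minimal payload sketch follows; the tool name and schema are made up for illustration and are not from the linked project:

```python
import json


def build_tool_chat_payload(model: str, user_message: str) -> str:
    """Sketch of an Ollama /api/chat request body that declares one tool."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical tool for illustration
                    "description": "Get the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
        "stream": False,  # tool calls come back in a single non-streamed response
    }
    return json.dumps(payload)


body = build_tool_chat_payload("llama3.1", "What's the weather in Berlin?")
print(json.loads(body)["tools"][0]["function"]["name"])
```

Whether a given model actually emits tool calls depends on the model; only tool-capable models (e.g. the Llama 3.1 family) will populate `tool_calls` in the response.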
-
### Describe the bug
When I use Ollama locally it shows my model, but when I try to use it over the internet the model list is empty.
### Link to the Bolt URL that caused the error
https://github.com/coleam00/bolt.ne…
-
### Issue
Using aider.chat with ollama, got
```
litellm.APIConnectionError: Ollama Error - {'error': 'an unknown error was encountered while running the model '}
Traceback (most recent ca…
-
Do you support Ollama LLMs and embeddings for kg_builder?
At a glance it looks like the embedder could be swapped out easily for an Ollama embedder, but I'm not sure how the LLM works with the KG pipeline. Could y…
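On the embedder side, swapping in Ollama would mean calling its embeddings endpoint. A rough sketch of the request such an embedder would issue, using only the standard library; the model name is an assumption, and the request is only constructed, not sent:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address


def build_embedding_request(model: str, text: str) -> urllib.request.Request:
    """Build a request to Ollama's /api/embeddings endpoint for one text chunk."""
    payload = {"model": model, "prompt": text}
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/embeddings",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# "nomic-embed-text" is a common Ollama embedding model tag (assumption).
req = build_embedding_request("nomic-embed-text", "a chunk of text to embed")
print(req.full_url)
```

The response carries an `embedding` array of floats, which is the shape most embedder interfaces expect to return.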
-
ollama webui
wrx89 updated
1 month ago
-
### Reference Issues
_No response_
### Summary
Currently, Ollama URL is hardcoded to `http://localhost:11434/api`
This becomes an issue when deploying Kotaemon in more advanced scenarios…
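A common pattern for this is to read the base URL from an environment variable and fall back to the current hardcoded value, so local setups keep working unchanged. A minimal sketch; the variable name `OLLAMA_BASE_URL` is an assumption for illustration, not Kotaemon's actual config key:

```python
import os

# Current hardcoded value, kept as the default so nothing breaks locally.
DEFAULT_OLLAMA_URL = "http://localhost:11434/api"


def get_ollama_url() -> str:
    """Resolve the Ollama API base URL from the environment, with a fallback."""
    # rstrip("/") tolerates a trailing slash in the user-provided value.
    return os.environ.get("OLLAMA_BASE_URL", DEFAULT_OLLAMA_URL).rstrip("/")


# Example: point the app at a remote Ollama deployment.
os.environ["OLLAMA_BASE_URL"] = "http://ollama.internal:11434/api/"
print(get_ollama_url())
```

Any call sites that currently embed the literal URL would then call `get_ollama_url()` (or read an equivalent config entry) instead.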
-
### Description
Set up Ollama + a fresh bigAGI install. I have followed the basic steps. Ollama has been verified to work with other apps. I added Ollama to bigAGI as a service provider and can see …