-
https://docs.quarkiverse.io/quarkus-langchain4j/dev/ollama.html#quarkus-langchain4j-ollama_quarkus-langchain4j-ollama-chat-model-enabled
What does "enabling a model" mean? What happens when it's "of…
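For context, the docs anchor in that URL suggests the property in question is `quarkus.langchain4j.ollama.chat-model.enabled` — a hedged guess (the exact name should be verified against the linked configuration reference) at how it would be set in `application.properties`:

```properties
# Assumed property name, derived from the docs anchor above;
# verify against the Quarkus LangChain4j Ollama configuration reference.
quarkus.langchain4j.ollama.chat-model.enabled=true
```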
-
Running chat_completion on Ollama sometimes works, but mostly returns a "can't be blank" error
```
messages = [
  %{role: "user", content: "Who were the first three presidents of the Unite…
-
### What happened?
We can't use Vision with `ollama_chat`, but it's working with `ollama`.
config.yaml
```yaml
- model_name: 'llava:7b'
  litellm_params:
    model: 'ollama_chat/llava:7b…
-
### Extension
https://www.raycast.com/massimiliano_pasquini/raycast-ollama
### Raycast Version
1.74.1
### macOS Version
14.4.1
### Description
I have Ollama running on my home server thru Docke…
-
With ollama-python 0.3.0 and the latest ollama server, I'm systematically getting an exception, even with the basic chat example provided (e.g. `examples/chat/main.py`).
### Code to reproduc…
-
Hi, thanks for sharing such good work. To my understanding, the LLMs are used to extract entities. I checked this code: https://github.com/TheAiSingularity/graphrag-local-ollama/blob/main/graphrag…
-
### Issue
I am trying to use a model file to quiet down warning messages according to [https://aider.chat/docs/config/adv-model-settings.html](https://aider.chat/docs/config/adv-model-settings.html).…
-
Can you support Ollama function calling in the same way as LangChain? Thanks.
Link to relevant docs:
https://js.langchain.com/v0.2/docs/integrations/chat/ollama#tools
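For reference, Ollama's `/api/chat` endpoint accepts a `tools` array using OpenAI-style function schemas. Below is a minimal sketch that only assembles the request body (no network call); the tool name, parameters, and model name are illustrative assumptions, not part of the linked docs:

```python
# Sketch: build an Ollama /api/chat request body with a tool definition.
# Tool name, schema, and model are hypothetical examples.

def build_tool_chat_request(model, user_message, tools):
    """Assemble the JSON body for a POST to Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": tools,
        "stream": False,
    }

weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",  # hypothetical tool name
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

request = build_tool_chat_request(
    "llama3.1", "What's the weather in Paris?", [weather_tool]
)
```

The resulting dict can be POSTed to `http://localhost:11434/api/chat`; when the model decides to call a tool, the response message carries a `tool_calls` list instead of plain content.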
-
**Bug Description**
ollama llava:7b is a vision model (https://ollama.com/library/llava), but Chatbox says it's not.
**How to reproduce**
Chat with an image.
-
Hi!
There's a relatively new ollama chat API:
https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-chat-completion
It works in a very similar way to the OpenAI chat API:
```
cur…