-
configuration:
listing models went well:
using an Ollama model fails with "Model not found" and **hangs** there:
it seems like the Ollama URL used the default value (http://127.0.0.1:11434…
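One way to confirm which base URL the client is actually reaching is to point the official `ollama` Python client at the host explicitly and list models; the host below is just the default and would need to be replaced with the real server address. A minimal sketch:

```python
from ollama import Client

# Point the client at an explicit host instead of relying on the default
# http://127.0.0.1:11434 (replace with the actual server address if remote).
client = Client(host="http://127.0.0.1:11434")

# If this lists models but a specific model still fails with "Model not found",
# the model name/tag in the config likely doesn't match what `ollama list` shows.
print(client.list())
```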
-
Running chat_completion on Ollama sometimes works, but mostly returns a "can't be blank" error:
```
messages = [
  %{role: "user", content: "Who were the first three presidents of the Unite…
```
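For reference, Ollama's native `/api/chat` endpoint expects both a `model` and a non-empty `messages` list, which is worth checking when a "can't be blank" validation error appears. A minimal sketch of the payload shape in Python (the model name is a placeholder, not taken from the report above):

```python
import requests

# Minimal request body for Ollama's /api/chat: a model name plus a non-empty
# list of messages. Use a model that `ollama list` actually shows.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "Who were the first three presidents of the United States?"}
    ],
    "stream": False,
}

resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```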
-
### Describe your issue
I am trying to get Bionic to connect to Ollama on Windows, but I can't seem to get it configured correctly.
- Ollama works at the `http://localhost:11434/v1/chat/completions…
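One way to rule out the server side is to hit Ollama's OpenAI-compatible endpoint directly; if this works, the remaining problem is on the Bionic configuration side. A minimal sketch using the `openai` Python client (the model name is a placeholder; the API key is ignored by Ollama but required by the client library):

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible API under /v1; the api_key value is not
# checked by Ollama, but the client library requires something non-empty.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# Placeholder model name; use whatever `ollama list` shows on the machine.
resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
print(resp.choices[0].message.content)
```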
-
### Extension
https://www.raycast.com/massimiliano_pasquini/raycast-ollama
### Raycast Version
1.74.1
### macOS Version
14.4.1
### Description
I have Ollama running on my home server thru Docke…
-
config:
```lua
return {
  "huggingface/llm.nvim",
  opts = {
    model = "rouge/autocoder-s-6.7b:latest",
    backend = "ollama",
    url = "http://localhost:11434/api/generate",
    request_…
```
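Independent of the editor plugin, a quick sanity check is to call the same `/api/generate` URL the config points at and confirm the model named there responds. A minimal sketch, assuming Ollama is running locally and the model has been pulled under that exact tag:

```python
import requests

# Same endpoint and model as in the llm.nvim config above; adjust the tag if
# the model was pulled under a different name.
payload = {
    "model": "rouge/autocoder-s-6.7b:latest",
    "prompt": "def fibonacci(n):",
    "stream": False,
}

resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])
```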
-
Hi there,
I recently stumbled upon your paper, and Phudge looks great! I was wondering whether you have considered adding it to Ollama so that it can be used efficiently (Ollama is nice because y…
rasbt updated 2 weeks ago
-
Hi guys!
My code stopped working and now I am receiving the error:
**ModuleNotFoundError: No module named 'ollama'**
I am running on Google Colab, and below you can see the part with the problem:…
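One likely cause is that the `ollama` Python package isn't installed in the current Colab runtime; fresh runtimes start without it and lose any packages installed in earlier sessions. A minimal sketch of the usual fix, meant to be run in a Colab cell:

```python
# In a fresh Colab runtime the package has to be (re)installed, since
# installed packages are lost whenever the runtime restarts:
!pip install ollama

import ollama  # the ModuleNotFoundError should be gone after the install

# Note: this package is only a client; it still needs a reachable Ollama
# server (http://127.0.0.1:11434 by default) to actually run models.
```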
-
TL;DR: Add an [Ollama](https://ollama.com/) component to Aspire, similar to the [OpenAI](https://learn.microsoft.com/en-us/dotnet/aspire/azureai/azureai-openai-component?tabs=dotnet-cli) component.
## C…