ollama / ollama

Get up and running with Llama 3.2, Mistral, Gemma 2, and other large language models.
https://ollama.com
MIT License

Unable to run inference from web app #7380

Open MatthewDlr opened 4 hours ago

MatthewDlr commented 4 hours ago

What is the issue?

Hi, I recently tried to host this project to have a better UI for running Ollama. The app successfully fetches the tags from /api/tags; however, when I try to send a chat via /api/chats, the request is rejected, and I don't know why. [screenshots: the rejected chat request]

In the Ollama logs, I can see the tags request but no trace of the chat request. [screenshot: Ollama server logs]

FYI, I tried every configuration of OLLAMA_HOST and OLLAMA_ORIGINS and restarted the app multiple times, without success. Is it a bug, or am I doing something wrong?
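For reference, the chat endpoint documented in the Ollama API is `POST /api/chat` (singular); a client calling `/api/chats` would get a 404 no matter how OLLAMA_HOST and OLLAMA_ORIGINS are set. Below is a minimal sketch for testing the server directly, bypassing the web app, assuming the default port 11434 and a locally pulled `llama3.2` model (neither is confirmed in the issue):

```typescript
// Sketch: call the documented chat endpoint directly to separate
// server-side problems from browser/CORS problems.
// Assumptions (not stated in the issue): Ollama listens on the default
// port 11434 and the "llama3.2" model has been pulled locally.
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.2",
    messages: [{ role: "user", content: "Say hello" }],
    stream: false, // return one JSON object instead of a stream of chunks
  }),
});
console.log(res.status); // 404 suggests a wrong path; 403 suggests an origin rejection
if (res.ok) {
  const data = await res.json();
  console.log(data.message.content); // the assistant's reply
}
```

If this direct request succeeds while the browser request fails, the remaining suspect is CORS: on macOS the allowed origins are typically set with `launchctl setenv OLLAMA_ORIGINS "<your web app's origin>"` followed by restarting the Ollama app.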

OS

macOS

GPU

Apple

CPU

Apple

Ollama version

0.3.14

rick-github commented 1 hour ago

What do the logs in the project show?