Open vvstubbs opened 1 week ago
Have you switched to local mode in the UI?
Well, I do feel a bit stupid now. In my defense, I understood that "switch" the opposite way around. It is now using my local Ollama, but after a while it returns 500: and nothing else. I'm assuming it's a timeout.
Hmm, that's weird. Can you try running `ollama serve`, then making a query from Farfalle? There might be something in the `ollama serve` logs to look at.
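Before digging into the logs, it can help to confirm the Ollama server is reachable at all from outside Farfalle. Here is a minimal sketch that does that, assuming the host address mentioned later in this thread (`http://192.168.0.191:11434`) and Ollama's standard `/api/tags` endpoint, which lists locally installed models:

```python
import json
import urllib.error
import urllib.request


def check_ollama(base_url="http://192.168.0.191:11434"):
    """Return the model names Ollama reports, or None if the server is unreachable."""
    try:
        # /api/tags lists the locally installed models; if this succeeds,
        # the server is up and answering HTTP on that address.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None


if __name__ == "__main__":
    models = check_ollama()
    print("unreachable" if models is None else models)
```

If this returns a model list but Farfalle still gets a 500 after a delay, the problem is more likely a request timeout inside Farfalle's backend than a connectivity issue.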
As the title indicates, it says I don't have OpenAI quota, but I don't want to use OpenAI or anything else outside my own system. I have a local Ollama with many local models, which works with standard chat clients, and a local SearXNG; everything is in-house for me, except for the need to access the internet for search. Below is my compose file. My network has everything local that anything could ever need: the latest Ollama engine, two types of databases, Redis, Memcached, etc., all running bare metal locally. I don't understand where I'm going wrong.
The .env file:
The response from http://192.168.0.191:11434: