JHubi1 / ollama-app

A modern and easy-to-use client for Ollama
Apache License 2.0

Request Failed. Server issues #40

Closed. NuclearDuck13 closed this issue 2 months ago.

NuclearDuck13 commented 3 months ago

I have Ollama running on my Windows machine and ngrok running with a static URL. I use that URL in the app's settings and it says it's good, and when I open the URL on multiple devices it says "Ollama is running!". But when I select a model in the app and send a message, I get "Issue: Request Failed. Server issues". Is this a problem with my PC setup or my phone?
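
One thing worth noting when narrowing this down: the "Ollama is running!" message only confirms a plain GET against the root path, while the app also depends on the API endpoints. A quick check from another device, with the static ngrok domain below as a placeholder, would show whether the API itself is reachable through the tunnel:

```sh
# Root path: returns "Ollama is running" on a plain GET
curl https://YOUR-STATIC-URL.ngrok-free.app/

# Model list endpoint, which the app also relies on
curl https://YOUR-STATIC-URL.ngrok-free.app/api/tags
```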

JHubi1 commented 3 months ago

After selecting a model or before the models are displayed?

NuclearDuck13 commented 3 months ago

After selecting the model. I select llama3.1 and try sending a message and that's the error I get.

JHubi1 commented 3 months ago

Have you tried any other model?

NuclearDuck13 commented 3 months ago

Just tried with phi3, same issue
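
Since the error appears on two different models, one way to isolate whether the problem is the server or the app is to call the generate endpoint directly through the tunnel (the domain below is a placeholder for the static ngrok URL):

```sh
# POST a minimal prompt through the tunnel; a JSON response means
# generation works server-side and the issue is between app and tunnel
curl https://YOUR-STATIC-URL.ngrok-free.app/api/generate \
  -d '{"model": "phi3", "prompt": "Hello", "stream": false}'
```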

JHubi1 commented 2 months ago

Could you download the latest version, v1.2.0, and set the timeout multiplier (Settings > Interface) to ten? Does that solve the issue?

NuclearDuck13 commented 2 months ago

Here are the things I did that fixed the problem (not sure which did what, but it's working now):

- Updated everything (app, ngrok, Ollama)
- Set the timeout multiplier to 10
- Appended `--host-header="localhost:11434"` to my ngrok run command (this is very likely what fixed it)
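
For reference, a sketch of the full ngrok invocation these steps imply (the static domain is a placeholder). Rewriting the Host header matters here likely because Ollama validates the Host header of incoming requests and rejects ones it doesn't expect, so tunneled traffic has to arrive looking like it was addressed to localhost:11434:

```sh
# Tunnel Ollama's default port and rewrite the Host header so
# Ollama accepts the forwarded requests
ngrok http 11434 \
  --domain=YOUR-STATIC-DOMAIN.ngrok-free.app \
  --host-header="localhost:11434"
```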