JHubi1 / ollama-app

A modern and easy-to-use client for Ollama
Apache License 2.0

moondream LLM returns error with ollama-app #47

Closed: fliker09 closed this 1 week ago

fliker09 commented 1 month ago

Requirements

Platform

Android

Description

moondream LLM returns an error with ollama-app

Steps to reproduce

  1. Run ollama with moondream LLM
  2. Choose the image
  3. Give the prompt
  4. Send the prompt (screenshot: Screenshot_20240928-023254)
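The same request can be reproduced against the Ollama HTTP API directly, which helps rule out the app itself. A minimal sketch of building the request body, assuming an Ollama server on the default `localhost:11434` with `moondream` already pulled; the image bytes here are a placeholder:

```python
import base64
import json

def build_generate_payload(model: str, prompt: str, image_bytes: bytes) -> dict:
    """Build the JSON body for Ollama's POST /api/generate endpoint.

    Multimodal models such as moondream accept base64-encoded images
    in the "images" field of the request.
    """
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }

# Placeholder image bytes; in practice, read them from a file.
payload = build_generate_payload("moondream", "Describe this image.", b"\x89PNG...")
body = json.dumps(payload)
# POST `body` to http://localhost:11434/api/generate to reproduce the request
# outside the app, e.g. with curl or requests.
```

If the raw API call also stalls for minutes, the delay is on the server/model side rather than in the client.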

Expected behavior

A response from moondream LLM.

Actual behavior

It thinks for a long time (minutes) and then returns an error (screenshot: Screenshot_20240928-023419).

Screenshots or additional context

No error seen on ollama side, except for this warning: "multimodal models don't support parallel requests yet".

JHubi1 commented 1 month ago

A server error without an actual server-side issue suggests a timeout. Please open Settings > Interface > Timeout multiplicator and set it to 10 for testing. Does that solve the issue?
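For context, the multiplicator presumably scales the app's base request timeout, so a value of 10 gives a slow multimodal model ten times as long to answer before the client gives up. A hedged sketch of the idea; the 30-second base value is an assumption for illustration, not the app's actual constant:

```python
def effective_timeout(base_seconds: float, multiplicator: float) -> float:
    """Scale a client-side request timeout by a user-set multiplicator."""
    return base_seconds * multiplicator

# With an assumed 30 s base, a multiplicator of 10 yields a 300 s budget,
# giving slow multimodal generations room to finish before the client aborts.
budget = effective_timeout(30, 10)
```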

fliker09 commented 1 month ago

It actually... helped! And the weirdest thing: getting a response didn't take any longer than it previously took to show the error! By the way, adding timestamps to the messages would be a nice addition ^_^ OK, it works now, but I found a different issue: it seems the second image is not sent (or maybe it's the LLM's issue?): Screenshot_20240928-115553. I had to open a new chat for it to work properly: Screenshot_20240928-115805

JHubi1 commented 1 month ago

Please open a new issue; this isn't related. And close this one if the problem is solved.