valentinfrlch / ha-llmvision

Let Home Assistant see!

Ollama Max tokens invalid option #72

Closed: starsoccer closed this issue 1 month ago

starsoccer commented 1 month ago

It seems Ollama does not support a max_tokens parameter; based on the docs here, https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values, it expects a num_predict parameter instead.

Currently when using LLM Vision I get the below warnings in Ollama because max_tokens is set:

[GIN] 2024/10/21 - 23:34:20 | 200 | 12.535419453s | 192.168.5.100 | POST "/api/chat"
time=2024-10-21T23:35:44.031Z level=WARN source=types.go:509 msg="invalid option provided" option=max_tokens
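For reference, here is a minimal sketch of what a chat request with the correct parameter could look like, assuming a default local Ollama server at http://localhost:11434 and the synchronous requests library (the integration itself may do this differently): Ollama ignores an OpenAI-style max_tokens field, so the limit goes into the options object as num_predict.

```python
# Sketch: passing a token limit to Ollama's /api/chat endpoint.
# Assumption: a local Ollama server on the default port 11434.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"


def ask_ollama(model: str, prompt: str, max_tokens: int) -> str:
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        # Map a max_tokens setting onto Ollama's own parameter name.
        "options": {"num_predict": max_tokens},
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=60)
    response.raise_for_status()
    return response.json()["message"]["content"]


if __name__ == "__main__":
    print(ask_ollama("llava", "Describe this scene in one sentence.", 100))
```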
valentinfrlch commented 1 month ago

Thank you so much! I have experienced this myself but thought it had to be something else. Will be fixed in v1.2.1.

starsoccer commented 1 month ago

Awesome, I can confirm this is working now. For anyone else who sets this, be aware that at least with Ollama sentences can be cut off, so it is worth updating the prompt to also state the maximum response length you want. It would be nice if there was a way to do this automatically as well.
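A rough sketch of the kind of automatic prompt hint suggested above: since num_predict is a hard token cap, one workaround is to state a word budget in the prompt so the model tries to finish before the cut-off. The helper name and the 0.75 words-per-token ratio are assumptions for illustration, not part of the integration.

```python
# Hypothetical helper: bake the response budget into the prompt itself so the
# model aims to finish within the hard num_predict cap instead of being truncated.
def with_length_hint(prompt: str, max_tokens: int) -> str:
    # Rough heuristic (assumption): about 0.75 words per token.
    word_budget = max(1, int(max_tokens * 0.75))
    return f"{prompt} Answer in at most {word_budget} words."


print(with_length_hint("Describe the image.", 100))
# -> "Describe the image. Answer in at most 75 words."
```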