acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

fix: Max tokens setting for Ollama API #40

Closed — fixtse closed this 8 months ago

fixtse commented 8 months ago

Changed the max tokens setting to the correct Ollama API option (num_predict).

https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values
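For reference, Ollama's generate endpoint does not accept a `max_tokens` field; the generation length is controlled through the `num_predict` entry in the `options` object. Below is a minimal sketch of such a request — the endpoint, model name, prompt, and values are illustrative and not the integration's actual code:

```python
import requests

# Default local Ollama endpoint (assumption for this example)
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",                 # example model name
    "prompt": "Turn off the kitchen lights.",
    "stream": False,
    "options": {
        "num_predict": 128,            # token limit goes here, not in a "max_tokens" field
        "temperature": 0.7,            # example sampling parameter
    },
}

response = requests.post(OLLAMA_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["response"])
```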