nextcloud / llm

A Nextcloud app that packages a large language model (Llama2 / GPT4All Falcon)

Feature Request: allow to customize output size #45

Open blizzz opened 11 months ago

blizzz commented 11 months ago

… because sometimes the default value is too small and you receive an answer that is cut off, which seems weird and buggy if you do not know about the background. Maybe also set a higher default value?

n_predict seems to be the parameter to adjust.
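For illustration, here is a minimal sketch of where such a setting could take effect, assuming the model is invoked through LangChain's GPT4All wrapper (the actual integration point in this app may differ). The model path and the value 1024 are placeholders, not the app's defaults:

```python
from langchain.llms import GPT4All

# Hypothetical example: raise n_predict so longer answers are not truncated.
llm = GPT4All(
    model="/path/to/ggml-gpt4all-model.bin",  # placeholder model path
    n_predict=1024,  # maximum number of tokens to generate per answer
)

answer = llm("Summarize the key points of this text: ...")
print(answer)
```

Exposing this value as an admin setting (instead of a hard-coded default) would cover the customization part of this request.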