OpenSource03 closed this issue 1 year ago
Set this to a big number:
`-n N, --n_predict N`  number of tokens to predict (default: 200)
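For example, a command along these lines should raise the limit (the binary name, model path, and the value 2048 here are placeholders; adjust them to your setup):

```sh
# Ask for up to 2048 tokens instead of the default 200.
# "./main" and the model path are illustrative; substitute your actual binary and model file.
./main -m ./models/your-model.bin -n 2048 -p "Write a detailed, multi-paragraph answer about ..."
```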
You can also say to the AI "tell me more" or something similar, and it usually expands the answer. You may also need to state in the prompt that you want a long answer.
I should also mention that many of the options work the same way as in llama.cpp or gpt4all.
Hope this helps. :)
This looks like it works now. Please reopen this issue if I made any mistakes :)
Hi, it does. Thank you very much for your help. I just had to ask it to generate a longer response.
Hi!
How would one make it generate longer results? I need it to generate very long answers, but it always stops after a few lines.