SAI-sentinal-ai closed this issue 1 year ago
Hi, thanks for this. I am using it with GPT4All. How do I control how long the answers are? Currently they are quite short...

Hey @harnoor-saini, that's going to be a function of how the model was trained: most of the newer LLaMA-based models, like GPT4All and Alpaca, are fine-tuned on question/answer-style input to make them better suited to a chat-like experience, so there's not much we can do here. It might be worth experimenting with some of the other models, and I'll be adding support for tweaking some of the model hyperparameters soon, which might affect this too.
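For context on what such a hyperparameter would do: answer length in llama.cpp-style tools is typically bounded by a max-token cap (often named `n_predict` or `max_tokens`, depending on the binding). Here is a minimal toy sketch of how that cap works in a sampling loop; `sample_next_token` and the token values are stand-ins, not a real model API:

```python
# Toy sketch: how a max-token cap bounds generated answer length.
# `sample_next_token` is a stand-in for real model sampling.

def sample_next_token(tokens):
    # Stand-in: a real model would sample from its output distribution.
    return len(tokens)

def generate(prompt_tokens, max_tokens=8, eos_token=-1):
    """Append sampled tokens until EOS is emitted or max_tokens is reached."""
    out = list(prompt_tokens)
    for _ in range(max_tokens):
        tok = sample_next_token(out)
        if tok == eos_token:  # model decided the answer is complete
            break
        out.append(tok)
    return out

short = generate([0, 1], max_tokens=4)   # at most 4 new tokens
long = generate([0, 1], max_tokens=16)   # a higher cap allows longer answers
```

Note that raising the cap only permits longer answers; if the model itself emits an end-of-sequence token early (which is what fine-tuning on short Q/A pairs encourages), the answer stays short regardless.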