unclemusclez opened 1 week ago
Hello, can you give more information on this issue? Is there any useful output in the dialog window at the end? Could you also check the server-side log of the model to determine whether the error came from the context not being passed to the model, or from some other part? I don't know much about NGINX. In a previous change about timeouts (#378) I set a 25-second timer from the time of the request; maybe it has something to do with that? You can try pressing Refresh in the upper-right corner of the error message to see whether any replies were received after the timeout. I'm sorry if the error did come from a timeout that was too short, causing an error box to appear before a reply was received.
Number of characters in all messages = 39382

Not sure if this is what is causing the issue, but longer contexts seem to time out.
At the moment, I've selected all of my code and then used my own custom "Explain" template. Prompting "Test" in the chat bot seems to work, but longer prompts time out. The entire codebase is currently about 1000 lines of Python, seemingly equating to 1984.2265625 kB.
I am using Ollama with Twinny, and this connects over NGINX. That may be an area to inspect for the timeout: the instruct bot connects and communicates fine, but larger context windows trigger errors.
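Since the requests pass through NGINX, its default proxy timeouts (60 seconds) could be cutting off long generations before the model finishes. A minimal sketch of the relevant directives, assuming Ollama runs locally on its default port 11434 (the server name and location are placeholders, not from this setup):

```nginx
server {
    listen 80;
    server_name ollama.example.com;  # placeholder hostname

    location / {
        proxy_pass http://127.0.0.1:11434;  # Ollama's default port

        # Raise the proxy timeouts so long generations aren't cut off;
        # NGINX defaults these to 60s.
        proxy_read_timeout 300s;
        proxy_send_timeout 300s;

        # Disable response buffering so streamed tokens are forwarded
        # as they are generated instead of being held back.
        proxy_buffering off;
    }
}
```

If the errors stop with longer timeouts, that would point at NGINX rather than Twinny or Ollama.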