Closed: 4riw closed this issue 2 months ago
This is due to a small quirk with how Chat Completions works.
The OpenAI spec does not allow for a max_context_length field, so for compatibility reasons ChatterUI instead relies on the maximum context reported by the v1/models endpoint; this ensures that the context never exceeds the limits of that endpoint.
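For context, here is a minimal sketch of how a client might pull a context limit out of an OpenAI-compatible v1/models response. The field names (`max_context_length`, `context_length`) are backend-specific assumptions, not part of the official OpenAI spec, and the endpoint path and auth header are just the usual conventions:

```ts
// Hypothetical sketch: query an OpenAI-compatible /v1/models endpoint and
// read whatever context-limit metadata the backend happens to expose.
interface ModelEntry {
    id: string;
    // Non-standard fields some backends attach to their model listing:
    max_context_length?: number;
    context_length?: number;
}

async function getMaxContext(baseUrl: string, apiKey: string): Promise<number | undefined> {
    const res = await fetch(`${baseUrl}/v1/models`, {
        headers: { Authorization: `Bearer ${apiKey}` },
    });
    const body = (await res.json()) as { data: ModelEntry[] };
    const model = body.data[0];
    // Returns undefined when the backend reports no context limit at all.
    return model?.max_context_length ?? model?.context_length;
}
```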
What I mean is that on version 0.7.10, chat completions did not fill the JSON body with the chat history, so the AI lost the conversation context by the third turn. Meanwhile text completions, which is supposed to answer a single prompt, had its JSON body filled with the chat history. Here is the log: logs (1).txt
I'm using version 0.7.10, and there seems to be a mismatch between text completions and chat completions.
When I used the generic chat completions API, the sampler settings page did not let me set the maximum number of tokens, and the payload did not include the chat history. On the other hand, when I used the text completions API, it did what chat completions should do.
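To illustrate what I would expect each request body to look like (a rough sketch with placeholder model names and content, not the exact bodies ChatterUI builds), chat completions should carry the running history as a messages array, while text completions should carry a single flattened prompt string:

```ts
// Illustrative only: expected shape of a chat completions body,
// with the conversation history sent as a messages array.
const chatCompletionsBody = {
    model: "some-model",
    max_tokens: 256,
    messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "First question" },
        { role: "assistant", content: "First answer" },
        { role: "user", content: "Follow-up question" },
    ],
};

// Illustrative only: expected shape of a text completions body,
// a single flattened prompt string with no messages array.
const textCompletionsBody = {
    model: "some-model",
    max_tokens: 256,
    prompt: "You are a helpful assistant.\nUser: First question\nAssistant: First answer\nUser: Follow-up question",
};
```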