Vali-98 / ChatterUI

Simple frontend for LLMs built in react-native.
GNU Affero General Public License v3.0

Completions mismatch #79

Closed. 4riw closed this issue 2 months ago.

4riw commented 2 months ago

I'm using version 0.7.10, and it seems there's a mismatch between text completions and chat completions.

When I used the general chat completions API, the sampler settings page didn't let me set the maximum number of tokens, and the payload didn't include the chat history. On the other hand, when I used the text completions API, it did what chat completions should do.
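For reference, this is roughly the shape I expected each request body to take (a sketch based on the OpenAI-style spec, not ChatterUI's actual payloads; model names and values are illustrative):

```ts
// Chat completions: the full conversation history goes into `messages`.
const chatCompletionsBody = {
    model: 'my-model',
    messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: 'First turn' },
        { role: 'assistant', content: 'First reply' },
        { role: 'user', content: 'Second turn' },
    ],
    max_tokens: 256,
};

// Text completions: a single flattened `prompt` string instead of a message list.
const textCompletionsBody = {
    model: 'my-model',
    prompt: 'Answer the following question: ...',
    max_tokens: 256,
};
```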

Vali-98 commented 2 months ago

This is due to a small quirk with how Chat Completions works.

The OpenAI spec does not allow for a max_context_length field, so for compatibility reasons ChatterUI instead relies on the maximum context reported by the v1/models endpoint. This ensures that the context never exceeds the context limit of that endpoint.
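As an illustration (a minimal sketch, not ChatterUI's actual implementation; the field that reports the context size varies by backend, so `context_length` here is an assumption), the idea looks roughly like this:

```ts
type ModelEntry = { id: string; context_length?: number };

// Derive the usable context size from /v1/models instead of sending a
// non-standard max_context_length field in the completion request.
async function getMaxContext(baseUrl: string, fallback = 4096): Promise<number> {
    const res = await fetch(`${baseUrl}/v1/models`);
    const body = (await res.json()) as { data?: ModelEntry[] };
    // Use the first model's reported context size if present, otherwise fall back.
    return body.data?.[0]?.context_length ?? fallback;
}
```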

4riw commented 2 months ago

I mean that on version 0.7.10, chat completions didn't fill the JSON body with the chat history, so the AI lost the conversation context by the third turn. Meanwhile, text completions, which is supposed to answer a single prompt, had its JSON body filled with the chat history. Here is the log: logs (1).txt