henrycunh / golem

✨ A beautiful UI for ChatGPT and other conversational models
https://golem.chat
MIT License

Bug Report: No response on token overflow #22

Closed · gdassori closed this 1 year ago

gdassori commented 1 year ago

When the max tokens setting is N and prompt + N > 4001 (the model limit), the request to OpenAI fails with an error, but the chat box doesn't show any notification.

How to reproduce: Set the token limit to 4000 and start a new chat with any prompt.

Suggestion: Use tiktoken (https://www.npmjs.com/package/@dqbd/tiktoken) to count the tokens in the input and set max_tokens to model_max_tokens - input_token_count.
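
A minimal sketch of that clamping logic, assuming @dqbd/tiktoken and gpt-3.5-turbo with a 4096-token context window (the helper name and constant are illustrative, and chat models add a few tokens of per-message overhead that this doesn't account for):

```ts
import { encoding_for_model } from "@dqbd/tiktoken";

// Illustrative constant: context window of the target model.
const MODEL_MAX_TOKENS = 4096;

// Clamp the user's max_tokens so prompt + completion fits the context window.
function clampMaxTokens(prompt: string, requestedMaxTokens: number): number {
  const enc = encoding_for_model("gpt-3.5-turbo");
  try {
    const promptTokens = enc.encode(prompt).length;
    const available = MODEL_MAX_TOKENS - promptTokens;
    return Math.max(0, Math.min(requestedMaxTokens, available));
  } finally {
    enc.free(); // the encoder is WASM-backed and must be freed explicitly
  }
}
```

With something like this in place, a limit of 4000 on a non-trivial prompt would be reduced to what actually fits, instead of producing a silent API error.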