danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, OpenAI, Assistants API, Azure, Groq, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

[Bug]: your messages resulted in 8193 tokens #934

Closed beddows closed 11 months ago

beddows commented 12 months ago

Contact Details

No response

What happened?

I get the following error intermittently:

Oops! Something went wrong. Please try again in a few moments. Here's the specific error message we encountered: Failed to send message. HTTP 400 - { "error": { "message": "This model's maximum context length is 8192 tokens. However, your messages resulted in 8193 tokens. Please reduce the length of the messages.", "type": "invalid_request_error", "param": "messages", "code": "context_length_exceeded" } }

But when I do get the error, it always exceeds the context length by one token.

Steps to Reproduce

I can never anticipate when it's going to occur.

What browsers are you seeing the problem on?

Safari

Relevant log output

No response

Screenshots



danny-avila commented 12 months ago

Counting tokens won't always be accurate, since we can't get the API's count when streaming, but as you can see, we're close! I don't run into this myself, though I also tend to start new conversations before they get long. This isn't the first time I've seen this, but even then it was close (within 10 tokens over the max).

To remedy this, I could increase the margin for error and double-check that my token-counting tests are working as intended.

In the long run, you may never see this once I have #741 done.
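The "margin for error" idea above can be sketched roughly: keep a few tokens of headroom below the model's context limit and trim the oldest messages until the estimated prompt fits. This is a minimal illustration, not LibreChat's actual code; the `estimate_tokens` heuristic (~4 characters per token) and the `SAFETY_MARGIN` value are assumptions.

```python
# Sketch: trim oldest messages until the estimated prompt size,
# plus a safety margin, fits under the model's context window.
# The 4-chars-per-token estimate and margin value are assumptions.

MAX_CONTEXT = 8192    # context window for the model in the error above
SAFETY_MARGIN = 50    # headroom for token-counting drift (assumed value)

def estimate_tokens(text: str) -> int:
    """Rough heuristic: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)

def fit_messages(messages: list[str]) -> list[str]:
    """Drop the oldest messages until the estimate fits with margin to spare."""
    kept = list(messages)
    while kept and sum(map(estimate_tokens, kept)) > MAX_CONTEXT - SAFETY_MARGIN:
        kept.pop(0)  # discard the oldest message first
    return kept
```

Because local token counts can be off by a few tokens relative to the API's own count, the margin is what prevents the exact off-by-one failure reported here.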

danny-avila commented 11 months ago

I noticed today that our handling of tokens needs to be updated according to OpenAI's latest token counting examples found here: https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb
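For reference, the counting scheme in that cookbook notebook adds a fixed per-message overhead on top of the encoded content, plus a few tokens that prime the assistant's reply. A sketch of that accounting is below; the encoder is injected as a plain callable so the snippet stays self-contained, whereas real code would pass `tiktoken.encoding_for_model(...).encode`. The overhead constants shown are the ones the notebook lists for gpt-3.5-turbo-0613/gpt-4-style models.

```python
from typing import Callable

def num_tokens_from_messages(
    messages: list[dict],
    encode: Callable[[str], list],
    tokens_per_message: int = 3,  # per-message overhead (cookbook value)
    tokens_per_name: int = 1,     # extra token when a "name" field is present
) -> int:
    """Count prompt tokens the way the OpenAI cookbook notebook does."""
    num_tokens = 0
    for message in messages:
        num_tokens += tokens_per_message
        for key, value in message.items():
            num_tokens += len(encode(value))
            if key == "name":
                num_tokens += tokens_per_name
    num_tokens += 3  # every reply is primed with <|start|>assistant<|message|>
    return num_tokens
```

Missing these fixed overheads is exactly the kind of small, systematic undercount that would produce errors a handful of tokens over the limit.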