microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Issue]: BadRequestError: Error code: 400, exceeding max context length even with context_window_handler. #3564

Open its-arsalann opened 3 weeks ago

its-arsalann commented 3 weeks ago

Facing the error below even after implementing the context window handler.

Library: pyautogen 0.2.31

I am using GroupChat with two agents along with some tools. I have added the context window handler to both agents:

```python
context_handling.add_to_agent(engineer)
context_handling.add_to_agent(user_proxy)

GroupChat(agents=[engineer, user_proxy], .....)
```

Below are the context window handler's settings:

```python
context_configs = {
    "max_msgs": 20,
    "model": "gpt-4o",
    "max_tokens": 128000,
    "max_token_per_msg": 15000,
    "min_tokens": 0,
}
```
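For reference, here is a minimal sketch of how settings like these could be wired into the transform_messages capability in pyautogen 0.2. The transform classes and parameter names below are my best guess at the mapping, not a verbatim copy of the actual setup:

```python
from autogen.agentchat.contrib.capabilities import transform_messages, transforms

# Sketch: build the context handler from the config values shown above.
context_handling = transform_messages.TransformMessages(
    transforms=[
        # Keep only the most recent 20 messages.
        transforms.MessageHistoryLimiter(max_messages=20),
        # Truncate each message and cap the total history token count.
        transforms.MessageTokenLimiter(
            model="gpt-4o",
            max_tokens=128000,
            max_tokens_per_message=15000,
            min_tokens=0,
        ),
    ]
)

context_handling.add_to_agent(engineer)
context_handling.add_to_agent(user_proxy)
```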

The error usually appears when the "Number of tokens reduced from" count goes above 128k, for example: "Number of tokens reduced from 139924 to 47584".

Steps to reproduce

No response

Screenshots and logs

Error log:

```
Truncated 92340 tokens. Number of tokens reduced from 139924 to 47584
...
......\lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 144122 tokens (143706 in the messages, 416 in the functions). Please reduce the length of the messages or functions.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
```
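Since the error message counts both the messages (143706 tokens) and the function schemas (416 tokens), it may help to log both counts after the transforms run. Below is a minimal sketch of such a check, assuming the count_token and num_tokens_from_functions helpers from autogen.token_count_utils; the messages and functions arguments are placeholders for whatever is actually sent in the request:

```python
from autogen.token_count_utils import count_token, num_tokens_from_functions

def check_request_size(messages, functions=None, model="gpt-4o", limit=128000):
    """Rough pre-flight check of the request size against the model's context window."""
    msg_tokens = count_token(messages, model=model)
    fn_tokens = num_tokens_from_functions(functions, model=model) if functions else 0
    total = msg_tokens + fn_tokens
    print(f"messages: {msg_tokens} tokens, functions: {fn_tokens} tokens, total: {total}")
    if total > limit:
        print(f"Request would still exceed the {limit}-token context window.")
    return total
```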

Additional Information

pyautogen Version: 0.2.31
Operating System: Windows 11
Python Version: 3.10

ekzhu commented 1 week ago

@WaelKarkoub is this something you can take a look at?

Also, we are using the autogen-agentchat package now and the current version is 0.2.36.

its-arsalann commented 5 days ago

> @WaelKarkoub is this something you can take a look at?
>
> Also, we are using the autogen-agentchat package now and the current version is 0.2.36.

@ekzhu, can you please briefly explain the difference between the "pyautogen" (now "autogen") and "autogen-agentchat" PyPI projects?