Hello, I am following the code from the course Multi AI Agent systems with crewAI.
When I kick off the crew by doing
I am getting an error that says
Error code: 400 - {'error': {'message': "This model's maximum context length is 4096 tokens. However, your messages resulted in 4777 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
I wonder if it is related to another error I encountered, for which I had to set memory = False.
If not, I wonder how to limit the message length in crewai in order to solve this issue.
I am using GPT-3.5 Turbo.
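For reference, here is the arithmetic behind the error, using only the numbers from the message above. It shows the request is 681 tokens over the limit, so either the prompts/history need to shrink by at least that much, or a model variant with a larger context window is needed:

```python
# Numbers taken directly from the error message above.
context_limit = 4096    # maximum context length reported for the model
messages_tokens = 4777  # tokens the request's messages actually used

# How far over the limit the request is.
overflow = messages_tokens - context_limit
print(overflow)  # 681
```

So roughly 681 tokens of prompt, tool output, or memory would have to go before the request fits.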