Closed iamyihwa closed 2 months ago
Can you try gpt-4o and see if you get the same error? thanks
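The suggested switch to gpt-4o could look roughly like this, following the course-style crewAI API (crewai 0.30.x). This is a sketch, not a verified fix: the import paths, the `llm` parameter, and the agent fields below are assumptions based on that version of the library.

```python
# Sketch only: putting an agent on a larger-context model.
# Assumes crewai 0.30.x and the langchain-openai integration used in the course.
from crewai import Agent
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")  # 128k context vs 4k/16k for gpt-3.5-turbo

writer = Agent(
    role="Content Writer",               # illustrative role from the course style
    goal="Write a factually accurate article on the given topic",
    backstory="You are a writer on a multi-agent content team.",
    llm=llm,  # each agent can carry its own model
)
```

Running this requires an `OPENAI_API_KEY` in the environment; the same `llm` object can be shared across all agents in the crew.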
Latest version: crewai 0.30.11, crewai-tools 0.2.6
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
This issue was closed because it has been stalled for 5 days with no activity.
Hello, I am following the code from the course "Multi AI Agent Systems with crewAI".
When I kick off the crew by doing
I am getting an error that says
Error code: 400 - {'error': {'message': "This model's maximum context length is 4096 tokens. However, your messages resulted in 4777 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
I wonder if it is related to another error I encountered, for which I had to set memory = False.
If not, I wonder how to limit the message length in crewai in order to solve this issue.
I am using GPT-3.5 Turbo.
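As far as I can tell, crewai 0.30.x does not expose a single "message length" setting. As a generic illustration of what the error's "reduce the length of the messages" asks for, a message history can be trimmed to a token budget before each call. The function names and the 4-characters-per-token estimate below are illustrative, not a crewAI API:

```python
# Generic illustration (not a crewAI API): keep only the most recent messages
# whose estimated token count fits under a model's context window.
# The ~4 characters-per-token figure is a rough heuristic, not a real tokenizer.

def estimate_tokens(text: str) -> int:
    """Rough estimate: about 4 characters per token for English text."""
    return max(1, len(text) // 4)

def trim_history(messages: list[str], max_tokens: int) -> list[str]:
    """Drop the oldest messages until the estimated total fits the budget."""
    kept: list[str] = []
    total = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if total + cost > max_tokens:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))  # restore chronological order

history = ["a" * 400, "b" * 400, "c" * 400]  # ~100 estimated tokens each
trimmed = trim_history(history, max_tokens=250)
print(len(trimmed))  # → 2 (only the two newest messages fit)
```

An exact tokenizer such as tiktoken would give precise counts, but the cutoff logic stays the same.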