crewAIInc / crewAI-examples


maximum context length exceeded #121

Closed maxseminole closed 2 days ago

maxseminole commented 3 months ago

I receive an error message: openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 8192 tokens. However, your messages resulted in 8227 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}

How can I reduce the context length of the messages?

Thanks.
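No fix was posted before the issue went stale, but the error itself is clear: the prompt plus message history exceeded the model's 8192-token context window by 35 tokens. Two common workarounds are switching the agents to a model with a larger context window, or trimming the message history before it is sent. Below is a minimal sketch of the second approach; note that `trim_messages` and `estimate_tokens` are hypothetical helpers written for illustration (they are not part of crewAI or the OpenAI SDK), and the 4-characters-per-token ratio is a rough heuristic, not an exact tokenizer.

```python
# Sketch: drop the oldest non-system messages until the estimated
# token count fits under the model's context window.
# `trim_messages` / `estimate_tokens` are hypothetical helpers, not
# crewAI or OpenAI SDK functions; ~4 chars per token is a rough guess.

def estimate_tokens(text: str) -> int:
    # Rough approximation: about 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict], max_tokens: int = 8192) -> list[dict]:
    """Drop the oldest non-system messages until the total fits."""
    trimmed = list(messages)

    def total(msgs: list[dict]) -> int:
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while total(trimmed) > max_tokens:
        # Remove the first (oldest) message that is not the system prompt.
        for i, m in enumerate(trimmed):
            if m["role"] != "system":
                del trimmed[i]
                break
        else:
            break  # Only system messages remain; nothing more to drop.
    return trimmed

messages = [
    {"role": "system", "content": "You are a helpful agent."},
    {"role": "user", "content": "old question " * 4000},  # oversized history
    {"role": "user", "content": "latest question"},
]
fitted = trim_messages(messages, max_tokens=8192)
print(len(fitted))        # → 2 (the oversized old message is dropped)
print(fitted[0]["role"])  # → system (the system prompt is preserved)
```

For exact counts you would tokenize with the model's actual tokenizer (e.g. `tiktoken` for OpenAI models) instead of the character heuristic, and summarizing dropped messages rather than discarding them keeps more context at the same token budget.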

github-actions[bot] commented 1 week ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] commented 2 days ago

This issue was closed because it has been stale for 5 days with no activity.