This error occurs when the summarize_tweets function is called on a large number of tweets (e.g. 100+): the combined text exceeds the model's context window, and the API rejects the request with a message like:
This model's maximum context length is 4097 tokens. However, your messages resulted in 5435 tokens. Please reduce the length of the messages.
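One common workaround is to split the tweets into batches that each fit under the limit, summarize every batch separately, and then summarize the partial summaries. The sketch below illustrates the batching step; it assumes a 4097-token limit, reserves a hypothetical budget for the prompt and reply, and uses a crude characters-per-token heuristic (a library like tiktoken would give exact counts):

```python
# Sketch of a batching workaround, assuming a 4097-token context limit.
# PROMPT_BUDGET and the ~4-chars-per-token heuristic are assumptions,
# not exact values from the API.

MAX_TOKENS = 4097
PROMPT_BUDGET = 500          # rough reserve for the prompt and the model's reply
TOKENS_PER_CHAR = 1 / 4      # crude heuristic: roughly 4 characters per token


def estimate_tokens(text: str) -> int:
    """Very rough token estimate; use a real tokenizer for exact counts."""
    return max(1, int(len(text) * TOKENS_PER_CHAR))


def batch_tweets(tweets: list[str], budget: int = MAX_TOKENS - PROMPT_BUDGET) -> list[list[str]]:
    """Greedily pack tweets into batches whose estimated size fits the budget."""
    batches: list[list[str]] = []
    current: list[str] = []
    used = 0
    for tweet in tweets:
        cost = estimate_tokens(tweet)
        if current and used + cost > budget:
            batches.append(current)
            current, used = [], 0
        current.append(tweet)
        used += cost
    if current:
        batches.append(current)
    return batches
```

Each batch can then be passed to summarize_tweets on its own, and the resulting summaries concatenated and summarized once more (a simple map-reduce pattern), so no single request exceeds the model's context length.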