ahmedbesbes / media-agent

Scrape data from social media and chat with it using Langchain
130 stars 18 forks

Fix token limit when generating summary #1

Closed ahmedbesbes closed 1 year ago

ahmedbesbes commented 1 year ago

This happens when the `summarize_tweets` function is called on a large number of tweets (e.g., 100+): the concatenated text exceeds the model's context window.

```
This model's maximum context length is 4097 tokens. However, your messages resulted in 5435 tokens. Please reduce the length of the messages.
```
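One common fix is a map-reduce style summary: split the tweets into batches that each fit under the context limit, summarize every batch, then summarize the partial summaries. The sketch below is a hypothetical illustration, not the repo's actual code: `approx_tokens`, `batch_tweets`, and `summarize_tweets_safely` are invented names, the 4-characters-per-token heuristic is a rough approximation (a real implementation would count tokens with `tiktoken`), and `summarize` is a stub standing in for the LLM call.

```python
def approx_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # A production version would use tiktoken for exact counts.
    return max(1, len(text) // 4)


def batch_tweets(tweets: list[str], max_tokens: int = 3000) -> list[list[str]]:
    """Split tweets into batches whose combined size stays under max_tokens."""
    batches, current, current_tokens = [], [], 0
    for tweet in tweets:
        t = approx_tokens(tweet)
        if current and current_tokens + t > max_tokens:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(tweet)
        current_tokens += t
    if current:
        batches.append(current)
    return batches


def summarize(text: str) -> str:
    # Stub standing in for the real LLM call (e.g. an LLMChain).
    return text[:200]


def summarize_tweets_safely(tweets: list[str]) -> str:
    # Map step: summarize each batch independently, keeping each
    # prompt under the model's context limit.
    partial = [summarize("\n".join(batch)) for batch in batch_tweets(tweets)]
    # Reduce step: summarize the concatenated partial summaries.
    return summarize("\n".join(partial))
```

LangChain also ships this pattern as `load_summarize_chain(..., chain_type="map_reduce")`, which may be the simpler fix here if the summary already goes through a chain.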