A Langchain app that allows you to chat with multiple PDFs
InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 6777 tokens. Please reduce the length of the messages. #37
Problem:
InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 6777 tokens. Please reduce the length of the messages.
Fix: reduce `chunk_size` in `get_text_chunks` so the retrieved chunks (plus the prompt and chat history) fit within the model's 4097-token context window:

from langchain.text_splitter import CharacterTextSplitter

def get_text_chunks(text):
    text_splitter = CharacterTextSplitter(
        separator="\n",
        chunk_size=500,  # the fix is to change from 1000 to 500
        chunk_overlap=200,
        length_function=len
    )
    return text_splitter.split_text(text)
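To see why halving `chunk_size` helps, here is a minimal pure-Python sketch of fixed-size chunking with overlap (this is an illustration of the idea, not LangChain's actual `CharacterTextSplitter`, which also splits on the separator; the function name and sample text are hypothetical):

```python
def split_with_overlap(text, chunk_size=500, chunk_overlap=200):
    # Step forward by (chunk_size - chunk_overlap) so consecutive
    # chunks share `chunk_overlap` characters of context.
    step = chunk_size - chunk_overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks

sample = "x" * 1200  # stand-in for extracted PDF text
print([len(c) for c in split_with_overlap(sample)])  # → [500, 500, 500, 300]
```

Since the retriever stuffs several of these chunks into the prompt, smaller chunks directly shrink the total token count sent to the model. Note that `length_function=len` counts characters, not tokens, so a 500-character chunk is only roughly 125 tokens; the safety margin this leaves is why 500 works where 1000 overflowed.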