yvann-ba / Robby-chatbot

AI chatbot 🤖 for chat with CSV, PDF, TXT files 📄 and YTB videos 🎥 | using Langchain🦜 | OpenAI | Streamlit ⚡
MIT License

Maximum context length error #25

Closed deedeeharris closed 1 year ago

deedeeharris commented 1 year ago

Hey, thanks for sharing your project.

I tried uploading a few CSV files, but with all of them I got the following error: Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 6317 tokens. Please reduce the length of the messages.

That's after uploading the file, and prompting "Hello".

Does your script chunk the uploaded csv file?

Thanks, Yedidya

yvann-ba commented 1 year ago

Hey, the CSVLoader splits each line into a separate document, so if one line contains a lot of text the error can appear. I've never had this problem. Do you get this error even with a smaller file, like the sample file?
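To illustrate the point above: loading a CSV the way CSVLoader does yields one document per row, so a single row with a very long field produces a single very long document that can exceed the model's context window. A minimal sketch in plain Python (no LangChain; the mini-CSV here is made up for illustration):

```python
import csv
import io

# Hypothetical mini-CSV: one short row and one row with a long text field.
raw = (
    "name,notes\n"
    "alice,short note\n"
    'bob,"' + ("lots of text " * 20) + '"\n'
)

# CSVLoader-style loading: each row becomes one document whose content
# is the row's column/value pairs joined together.
rows = list(csv.DictReader(io.StringIO(raw)))
docs = ["\n".join(f"{k}: {v}" for k, v in r.items()) for r in rows]

# The second document is much longer than the first, because its entire
# row is kept as a single unit. A long enough row blows the token budget.
for d in docs:
    print(len(d))
```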

deedeeharris commented 1 year ago

Thanks for the quick response. A sample CSV works. I guess you're right, my records have multiple features with a lot of text. Do you have an idea how to solve this?

yvann-ba commented 1 year ago

Yes, you can chunk after the CSVLoader. Indexing will be a bit worse because some rows will be cut and therefore split across chunks, but apart from that I don't see another way. Alternatively, skip the CSVLoader and use CharacterTextSplitter directly, setting a limit of around 4000 tokens.
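The chunking idea above can be sketched without LangChain: split each per-row document on whitespace so no chunk exceeds a size limit. This is a naive stand-in for LangChain's CharacterTextSplitter, using a character count as a rough proxy for tokens; the `limit` value and function name are illustrative, not part of the project:

```python
def split_document(text: str, limit: int = 4000) -> list[str]:
    # Naive stand-in for a text splitter: cut on whitespace so that
    # no chunk exceeds `limit` characters (a rough proxy for tokens).
    # A single word longer than `limit` would still exceed it; a real
    # splitter (or a token-based one) handles that case properly.
    words = text.split()
    chunks, current = [], ""
    for w in words:
        candidate = (current + " " + w).strip()
        if len(candidate) > limit and current:
            chunks.append(current)
            current = w
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks


# Usage: run each overly long row-document through the splitter before
# embedding, so every chunk fits within the model's context window.
long_row = "word " * 2000
pieces = split_document(long_row, limit=100)
```

Chunks produced this way still come from a single CSV row, so related fields stay closer together than with an arbitrary split of the whole file, which is the trade-off being discussed above.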