pashpashpash / vault-ai

OP Vault ChatGPT: Give ChatGPT long-term memory using the OP Stack (OpenAI + Pinecone Vector Database). Upload your own custom knowledge base files (PDF, txt, epub, etc) using a simple React frontend.
https://vault.pash.city
MIT License

Increase token length beyond 8191 #28

Closed · paOol closed this 1 year ago

paOol commented 1 year ago

After increasing MAX_FILE_SIZE and MAX_TOTAL_UPLOAD_SIZE in fileupload.go, I uploaded an epub and, after a while, ran into this error.

The book is 45 MB.

2023/04/20 09:01:19 [UploadHandler ERR] Error getting embeddings: error, status code: 400, message: This model's maximum context length is 8191 tokens, however you requested 8230 tokens (8230 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.
[negroni] Apr 20 08:54:39 | 200 | 6m40.613778887s
          POST /upload

Is there a way to increase the number of tokens, or to chunk them appropriately for larger files?
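For context, the embedding model rejects any single request over 8191 tokens, so larger documents have to be split into smaller pieces before embedding. Below is a minimal, hypothetical Go sketch of word-boundary chunking, not the project's actual fileupload.go code: the `chunkText` function, the `safetyMargin` value, and the rough "1 token ≈ 4 characters" heuristic are all illustrative assumptions, and a real tokenizer would count tokens more precisely.

```go
package main

import (
	"fmt"
	"strings"
)

const (
	maxEmbeddingTokens = 8191 // model context limit, per the error message above
	charsPerToken      = 4    // rough heuristic; a real tokenizer is more accurate
	safetyMargin       = 500  // headroom, since the heuristic can undercount tokens
)

// chunkText splits text on word boundaries into chunks whose estimated
// token count stays under the embedding limit. A single word longer than
// the limit would still become an oversized chunk; a real implementation
// would need to handle that case.
func chunkText(text string) []string {
	maxChars := (maxEmbeddingTokens - safetyMargin) * charsPerToken

	var chunks []string
	var current strings.Builder

	for _, word := range strings.Fields(text) {
		// +1 accounts for the space separating this word from the previous one.
		if current.Len()+len(word)+1 > maxChars && current.Len() > 0 {
			chunks = append(chunks, current.String())
			current.Reset()
		}
		if current.Len() > 0 {
			current.WriteByte(' ')
		}
		current.WriteString(word)
	}
	if current.Len() > 0 {
		chunks = append(chunks, current.String())
	}
	return chunks
}

func main() {
	// Simulate a large extracted document and show how many chunks result.
	text := strings.Repeat("the quick brown fox jumps over the lazy dog ", 5000)
	chunks := chunkText(text)
	fmt.Printf("split %d characters into %d chunks\n", len(text), len(chunks))
	for i, c := range chunks {
		fmt.Printf("chunk %d: ~%d estimated tokens\n", i, len(c)/charsPerToken)
	}
}
```

Each chunk would then be embedded in its own request (or batched), so no single call exceeds the 8191-token context length.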

pashpashpash commented 1 year ago

@paOol this should be fixed now https://github.com/pashpashpash/vault-ai/commit/2bff1759dc64e0ae16be23c410f20b14c725c1ec