OP Vault ChatGPT: Give ChatGPT long-term memory using the OP Stack (OpenAI + Pinecone Vector Database). Upload your own custom knowledge base files (PDF, txt, epub, etc) using a simple React frontend.
After increasing MAX_FILE_SIZE and MAX_TOTAL_UPLOAD_SIZE in fileupload.go, I uploaded an epub and after a while ran into this error. The book is 45 MB.
2023/04/20 09:01:19 [UploadHandler ERR] Error getting embeddings: error, status code: 400, message: This model's maximum context length is 8191 tokens, however you requested 8230 tokens (8230 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.
[negroni] Apr 20 08:54:39 | 200 | 6m40.613778887s
POST /upload
Is there a way to increase the token limit, or to chunk the text appropriately for larger files?