askrella / whatsapp-chatgpt

ChatGPT + DALL-E + WhatsApp = AI Assistant :rocket: :robot:
3.47k stars · 835 forks

This model's maximum context length is 4096 tokens #125

Closed: thenetguy closed this issue 1 year ago

thenetguy commented 1 year ago

Not sure what I'm doing wrong, but I keep getting the error below when trying to use !gpt, even though I set the env MAX_MODEL_TOKENS: "4096":

An error occured, please contact the administrator. (This model's maximum context length is 4096 tokens. However, you requested 4250 tokens (154 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.)

Any suggestions would be greatly appreciated.

Re1da commented 1 year ago

Duplicate of #

thenetguy commented 1 year ago

I'm sorry I don't know what you mean, can you please elaborate?

navopw commented 1 year ago

The prompt you are putting in also costs tokens and reduces the limit.

Try a different value, for example:

MAX_MODEL_TOKENS=3000
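The arithmetic behind the error can be sketched like this (a minimal illustration, not the library's actual code; the numbers are taken from the error message in the first post):

```typescript
// The model's context window must hold BOTH the prompt tokens and
// the requested completion tokens (MAX_MODEL_TOKENS).
const contextLength = 4096;   // model limit, from the error message
const promptTokens = 154;     // "154 in the messages"
const maxModelTokens = 4096;  // MAX_MODEL_TOKENS as originally configured

const requested = promptTokens + maxModelTokens;
console.log(requested);                  // 4250 -> exceeds 4096, hence the error
console.log(requested <= contextLength); // false

// Lowering the completion budget leaves headroom for the prompt:
const lowered = 3000;
console.log(promptTokens + lowered <= contextLength); // true
```

So MAX_MODEL_TOKENS only caps the completion; the prompt (and any conversation history) is added on top, which is why the cap has to stay below the full context length.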
thenetguy commented 1 year ago

This worked, thank you sir!

bogdanr commented 1 year ago

It looks like the messages sometimes overflow with things other than the current message:

whatsapp-chatgpt-whatsapp-chatgpt-1  | ◇  [GPT] Received prompt from 40723372866@c.us: hello
whatsapp-chatgpt-whatsapp-chatgpt-1  | An error occured Error: This model's maximum context length is 4096 tokens. However, you requested 4231 tokens (1231 in the messages, 3000 in the completion). Please reduce the length of the messages or completion.
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at ChatGPT.askStream (file:///app/node_modules/chatgpt-official/src/classes/chatgpt.ts:152:11)
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at ChatGPT.ask (file:///app/node_modules/chatgpt-official/src/classes/chatgpt.ts:97:10)
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at Module.handleMessageGPT (/app/src/handlers/gpt.ts:27:15)
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at Module.handleIncomingMessage (/app/src/handlers/message.ts:119:3)
whatsapp-chatgpt-whatsapp-chatgpt-1  |     at Client.<anonymous> (/app/src/index.ts:84:3)

Here the message is just hello

rivanbello commented 1 year ago

Same problem here.

navopw commented 1 year ago

That's really strange; I haven't encountered this before. Can you give me the prompt so I can reproduce the error?

kingofkillers91 commented 1 year ago

Same error:

[GPT] Received prompt from 17869360334@c.us: hey
whatsapp-chatgpt_1  | ◇  [GPT] New conversation for 17869360334@c.us (ID: 55d53b60-2b31-4613-b593-890a1dcfcf55)
whatsapp-chatgpt_1  | An error occured Error: This model's maximum context length is 4097 tokens. However, you requested 4250 tokens (154 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.
whatsapp-chatgpt_1  |     at ChatGPT.askStream (file:///app/node_modules/chatgpt-official/src/classes/chatgpt.ts:152:11)
whatsapp-chatgpt_1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
whatsapp-chatgpt_1  |     at ChatGPT.ask (file:///app/node_modules/chatgpt-official/src/classes/chatgpt.ts:97:10)
whatsapp-chatgpt_1  |     at Module.handleMessageGPT (/app/src/handlers/gpt.ts:63:15)
whatsapp-chatgpt_1  |     at Module.handleIncomingMessage (/app/src/handlers/message.ts:143:3)
whatsapp-chatgpt_1  |     at Client.<anonymous> (/app/src/index.ts:84:3)

I'm running the bot on Docker, and I have no idea where I should set the max tokens, because there isn't any variable for it in the docker-compose.yml.
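In case it helps: with Docker Compose you can usually pass the variable through the `environment` key of the service. A sketch, assuming the service is named `whatsapp-chatgpt` as the log prefix suggests (check the project's actual compose file for the real service name and existing keys):

```yaml
services:
  whatsapp-chatgpt:
    # ...existing image/build configuration...
    environment:
      # Completion token budget; keep it below the model's context
      # length minus the expected prompt size (see discussion above).
      MAX_MODEL_TOKENS: "3000"
```

After editing, recreate the container (`docker compose up -d`) so the new environment value takes effect.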