mshumer / gpt-author

MIT License

InvalidRequestError: This model's maximum context length is 16385 tokens. However, your messages resulted in 16394 tokens. Please reduce the length of the messages. #2

Closed: pleabargain closed this issue 1 year ago

pleabargain commented 1 year ago

How do I set the token limit so that I don't get this error? I was at chapter 17 of 20 when the script died.
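One way to stay under the 16,385-token limit is to trim the conversation before each API call. A minimal sketch follows; the `estimate_tokens` and `trim_messages` helpers are hypothetical (not part of gpt-author), and the character-based token estimate is a rough proxy for an exact tokenizer such as tiktoken:

```python
MAX_CONTEXT_TOKENS = 16385   # limit reported in the error for this model
RESPONSE_BUDGET = 4000       # tokens reserved for the model's reply

def estimate_tokens(text):
    # Very rough heuristic: ~1 token per 4 characters of English text.
    # For exact counts you would use a real tokenizer (e.g. tiktoken).
    return max(1, len(text) // 4)

def trim_messages(messages, limit=MAX_CONTEXT_TOKENS - RESPONSE_BUDGET):
    # Keep the system prompt (index 0) and drop the oldest chat turns
    # until the running total fits under the limit.
    total = sum(estimate_tokens(m["content"]) for m in messages)
    trimmed = list(messages)
    while total > limit and len(trimmed) > 2:
        dropped = trimmed.pop(1)  # oldest non-system message
        total -= estimate_tokens(dropped["content"])
    return trimmed
```

Passing `trim_messages(messages)` instead of the full history to the chat-completion call keeps each request under the window, at the cost of the model losing the oldest chapters' context.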

pogic commented 1 year ago

Write fewer chapters, or upgrade to a GPT-4 model.

pleabargain commented 1 year ago

Write fewer chapters, or upgrade to a GPT-4 model.

I don't have access to GPT-4 :( I reduced the chapter count and got an EPUB. Thanks.