GreyDGL / PentestGPT

A GPT-empowered penetration testing tool
MIT License

Exception: This model's maximum context length is 4097 tokens. However, your messages resulted in 4581 tokens. Please reduce the length of the messages. #132

Closed · hamzahjazi closed this issue 1 year ago

hamzahjazi commented 1 year ago

I performed a simple pentest with PentestGPT, but after the third step an error appears saying that the maximum context length is 4097 tokens and my message is 4581 tokens. How can I increase the token limit so I can continue the steps of the pentest? Or is there another way to solve this problem?


GreyDGL commented 1 year ago

This will be solved in the next commit; I'll push it to the repo within the next two days. Meanwhile, you can set the GPT-3.5 session to the 16k-context model as a workaround (it is slightly more expensive): https://openai.com/pricing
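
For reference, a minimal sketch of that workaround, assuming the session is created through the OpenAI Python package (the 0.x-style `ChatCompletion` API); the exact place in PentestGPT where the model name is configured may differ, and the API key and messages below are placeholders:

```python
# Sketch: request the 16k-context GPT-3.5 variant instead of the default 4k model.
# Assumes the openai Python package (0.x API); adapt to wherever PentestGPT sets its model name.
import openai

openai.api_key = "sk-..."  # placeholder API key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",  # 16k-context variant; the default gpt-3.5-turbo is limited to ~4k tokens
    messages=[
        {"role": "system", "content": "You are a penetration testing assistant."},
        {"role": "user", "content": "Summarize the findings so far."},  # placeholder conversation
    ],
)
print(response["choices"][0]["message"]["content"])
```

Switching the model name only raises the ceiling; long sessions will still eventually need the conversation to be truncated or summarized, which is what the upcoming commit is meant to handle.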