chakkaradeep / pyCodeAGI


Max token length exceeded error #1

Open salvatoto opened 1 year ago

salvatoto commented 1 year ago

Running pycodeagi.py, I got a max token length exceeded error at the "APP CODE" step:

Traceback (most recent call last):
  File "pycodeagi.py", line 165, in <module>
    pycode_agi({"objective": objective})

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4170 tokens (1170 in your prompt; 3000 for the completion). Please reduce your prompt; or completion length.

chakkaradeep commented 1 year ago

The GPT3 version can only use 4097 tokens, and unfortunately, there is no way to increase that limit. The GPT4 version has higher token limits and can give you better results.
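As a workaround within the 4097-token limit, one option is to shrink the requested completion length (`max_tokens`) so that prompt + completion fits in the context window. The sketch below is not from pyCodeAGI itself; it uses a rough characters-per-token heuristic (a real implementation would count tokens exactly, e.g. with `tiktoken`), and the function names and margin are illustrative:

```python
# Cap the completion length so prompt + completion fits the model's
# context window. The 4097 figure is the GPT-3 limit from the error above.
MODEL_CONTEXT_LIMIT = 4097

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # For exact counts, use tiktoken's encoding for the model instead.
    return max(1, len(text) // 4)

def safe_max_tokens(prompt: str, desired: int = 3000, margin: int = 50) -> int:
    # Reserve `margin` tokens of headroom for estimation error, then
    # request the smaller of the desired completion and what fits.
    available = MODEL_CONTEXT_LIMIT - estimate_tokens(prompt) - margin
    return max(0, min(desired, available))
```

With the numbers from the traceback (a 1170-token prompt), `safe_max_tokens` would return roughly 2877 instead of the hard-coded 3000, keeping the request under the limit.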

chakkaradeep commented 1 year ago

Curious to know what you asked it to build. I can try it with the GPT4 version if you do not have access to GPT4.