When I run the script `run_layoutgpt_3d.py`, the following error occurs:

> This model's maximum context length is 4097 tokens. However, your messages resulted in 4109 tokens. Please reduce the length of the messages.
> Input too long. Will shrink the prompting examples
If you are running on gpt3.5 or gpt3.5-chat, please set `--gpt_input_length_limit 3000`. For living rooms, please also reduce the number of in-context demos to 4 by setting `--K 4`.
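Putting the two flags together, a possible invocation might look like the following (any other arguments your setup needs, such as dataset or output paths, are not shown here and depend on your configuration):

```shell
# Sketch of the suggested fix: cap the prompt length and, for living
# rooms, shrink the number of in-context demos. Other flags omitted.
python run_layoutgpt_3d.py \
    --gpt_input_length_limit 3000 \
    --K 4
```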