weixi-feng / LayoutGPT

Official repo for LayoutGPT
MIT License

About token's length #9

Closed · shanqiiu closed this issue 1 year ago

shanqiiu commented 1 year ago

When I run the script `run_layoutgpt_3d.py`, the following error occurs:

> This model's maximum context length is 4097 tokens. However, your messages resulted in 4109 tokens. Please reduce the length of the messages.
> Input too long. Will shrink the prompting examples

weixi-feng commented 1 year ago

If you are running on gpt3.5 or gpt3.5-chat, please set `--gpt_input_length_limit 3000`. For living rooms, please also reduce the number of in-context demos to 4 by setting `--K 4`.
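
For reference, a minimal sketch of how the two flags would be added to an existing invocation; the other arguments (dataset paths, room type, output directory, etc.) are assumed to stay as in your current command:

```bash
# A sketch, not the exact command: keep whatever other arguments you already
# pass to run_layoutgpt_3d.py and add/adjust these two flags.
python run_layoutgpt_3d.py \
    --gpt_input_length_limit 3000 \
    --K 4
# --gpt_input_length_limit 3000 keeps the prompt within gpt3.5's 4097-token context window.
# --K 4 uses four in-context demonstrations, as suggested above for living rooms.
```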