meta-llama / codellama

Inference code for CodeLlama models

Questions about exceeding Max_Token #242

[Open] Yindy07 opened this issue 2 months ago

Yindy07 commented 2 months ago

When running infilling inference, the input tokens exceed max_token (the model's maximum sequence length). Is there a good way to work around this?
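Not an official fix, but one common workaround is to trim the infilling context so the prompt fits within the `max_seq_len` passed to `Llama.build(...)`, keeping the text nearest the fill location since it usually matters most. A minimal sketch, assuming the repo's `Tokenizer.encode(s, bos, eos)` interface; the budget split and the helper itself are illustrative, not part of the library:

```python
MAX_SEQ_LEN = 4096      # value passed to Llama.build(...)
MAX_GEN_LEN = 128       # tokens reserved for the generated infill
SPECIAL_BUDGET = 8      # rough headroom for BOS / infilling control tokens (assumption)

def trim_infilling_context(tokenizer, prefix: str, suffix: str):
    """Keep the tail of the prefix and the head of the suffix so the
    total prompt stays within the sequence-length budget."""
    budget = MAX_SEQ_LEN - MAX_GEN_LEN - SPECIAL_BUDGET
    pre_ids = tokenizer.encode(prefix, bos=False, eos=False)
    suf_ids = tokenizer.encode(suffix, bos=False, eos=False)
    # Give roughly two thirds of the budget to the prefix, the rest to the
    # suffix -- an arbitrary split, tune it for your use case.
    pre_keep = min(len(pre_ids), (2 * budget) // 3)
    suf_keep = min(len(suf_ids), budget - pre_keep)
    pre_ids = pre_ids[-pre_keep:]   # tail of the prefix (closest to the hole)
    suf_ids = suf_ids[:suf_keep]    # head of the suffix (closest to the hole)
    return tokenizer.decode(pre_ids), tokenizer.decode(suf_ids)
```

Note that decoding and re-encoding can shift token boundaries slightly, so leave a little headroom in the budget. Alternatively, if memory allows, you can simply build the generator with a larger `max_seq_len`, since CodeLlama was trained with long-context support.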