Closed — eallion closed this 5 months ago
@eallion
Please use a model with a longer context length. The default model is gpt-3.5-turbo,
which has a 4k context length, so set gpt-3.5-turbo-16k
in your config as follows:
$ aicommits config set model=gpt-3.5-turbo-16k
The max_tokens
option only affects generation; it cannot limit the input tokens.
https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens
The maximum number of tokens that can be generated in the chat completion.
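To make the distinction concrete: since max_tokens only caps the output, the input (the diff plus prompt) must fit in the model's context window on its own. Below is a minimal sketch of a pre-flight check along those lines. The function names and the ~4-characters-per-token heuristic are illustrative assumptions, not part of aicommits or the OpenAI API; a real tokenizer such as tiktoken would give exact counts.

```python
# Rough pre-flight check: estimate whether a diff fits a model's context
# window before sending it. The ~4 chars/token ratio is a common
# approximation, not an exact tokenizer (assumption for illustration).

def estimate_tokens(text: str) -> int:
    # Very rough heuristic: about 4 characters per token for English text.
    return len(text) // 4

def pick_model(diff: str, reserved_output: int = 200) -> str:
    # gpt-3.5-turbo has a 4k context; gpt-3.5-turbo-16k has 16k.
    # Reserve some of the window for the generated commit message,
    # since input and output share the same context budget.
    if estimate_tokens(diff) + reserved_output <= 4096:
        return "gpt-3.5-turbo"
    return "gpt-3.5-turbo-16k"

print(pick_model("x" * 1000))   # small diff fits the 4k model
print(pick_model("x" * 40000))  # large diff needs the 16k model
```

The point of the sketch: a long diff overflows the 4k window no matter how small max_tokens is set, which is why switching to the 16k model is the fix here.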
Thanks!
Bug description
aicommits version: 1.11.0