Closed DeanXiao20 closed 1 year ago
@DeanXiao20 please check out the model limits set by OpenAI here. The extension doesn't have a way to increase that limit. Please note that you can update the max tokens in the extension setting `chatgpt.gpt3.maxTokens`, which defaults to 1024. Also note that the maxTokens value is shared between your request and the response: your question's total token count plus the response's total token count cannot exceed the limit set by the model.
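As a rough sketch of the shared budget (the model limit and prompt size below are illustrative assumptions, not values read from the extension):

```python
MODEL_LIMIT = 4096    # total context window of the model (assumption)
MAX_TOKENS = 1024     # chatgpt.gpt3.maxTokens: tokens reserved for the response
prompt_tokens = 900   # hypothetical token count of your question

# Request and response share one budget: if this sum exceeds the
# model limit, the response gets truncated.
total = prompt_tokens + MAX_TOKENS
print(total, total <= MODEL_LIMIT)
```

So raising maxTokens leaves less room for your prompt, and vice versa.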
My suggestion is that when the response is cut off due to length, you can ask the model to keep going with a "Continue" prompt.
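For reference, the setting can be changed in VS Code's settings.json; the value below is just an example, not a recommendation:

```json
{
  "chatgpt.gpt3.maxTokens": 2048
}
```

Keep in mind that raising it only helps up to the model's own context limit, which the extension cannot exceed.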
I want to write a 5000-character article, but the response only returns part of the content, not the full content. Can I change the response character limit and show the full content in vscode-chatgpt?