Closed: IAPython closed this issue 1 year ago.
@IAPython unfortunately, the token limits are set by OpenAI and there isn't anything this extension can do to bypass them. However, once a response is cut off due to length by the API, you can ask the model to pick up where it left off with this prompt: Continue. I hope that answers your question.
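For reference, a minimal sketch of that continuation loop with the official openai Python client might look like the following. This is not the extension's code; the model name, the deliberately small max_tokens value, and the example prompt are placeholders, and the only API-specific detail it relies on is the finish_reason field, which is "length" when the output was truncated.

```python
# Sketch: keep asking the model to continue while its reply is cut off
# by the max_tokens limit (finish_reason == "length").
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

messages = [{"role": "user", "content": "Write a long explanation of asyncio."}]
full_reply = ""

while True:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=messages,
        max_tokens=256,         # small on purpose, to force a cutoff
    )
    choice = response.choices[0]
    full_reply += choice.message.content or ""

    if choice.finish_reason != "length":
        break  # the model finished on its own

    # The reply was truncated: keep the partial answer in the history
    # and ask the model to continue from where it stopped.
    messages.append({"role": "assistant", "content": choice.message.content})
    messages.append({"role": "user", "content": "Continue"})

print(full_reply)
```

Each individual response is still capped by the model's token limits; the loop only stitches the pieces together on the client side.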
Even after continuing to search for a reasonable answer or configuration, it is still not possible to control when ChatGPT responses get cut off, even when using the API. Beyond a certain length the response is simply truncated; it is not an issue with the IDE but the same behavior seen in the ChatGPT web interface.
Thank you
IAPython