Closed HyunggyuJang closed 11 months ago
CodeRabbit
The changes in this pull request extend the TokenLimits class by adding support for a new model, "gpt-3.5-turbo-16k". The token limits for this model are set to fit the model's larger context window while leaving headroom for the response.
File | Summary
---|---
src/limits.ts | The TokenLimits class has been updated to support the "gpt-3.5-turbo-16k" model, with the maximum tokens limit set to 16300 and the response tokens limit to 3000.
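A minimal sketch of what the change to src/limits.ts could look like. The field names (maxTokens, responseTokens, requestTokens), the constructor shape, and the default-model limits are assumptions for illustration; only the 16300/3000 values for "gpt-3.5-turbo-16k" come from this PR.

```typescript
// Hypothetical sketch of the TokenLimits update; field names and the
// default-model branch are assumptions, not the repository's exact code.
class TokenLimits {
  maxTokens: number;
  responseTokens: number;
  requestTokens: number;

  constructor(model: string = "gpt-3.5-turbo") {
    if (model === "gpt-3.5-turbo-16k") {
      // New model added by this PR: 16k context window, with some
      // headroom reserved below the hard limit.
      this.maxTokens = 16300;
      this.responseTokens = 3000;
    } else {
      // Assumed defaults for the base model (illustrative only).
      this.maxTokens = 4000;
      this.responseTokens = 1000;
    }
    // Tokens left over for the request prompt itself.
    this.requestTokens = this.maxTokens - this.responseTokens;
  }
}

const limits = new TokenLimits("gpt-3.5-turbo-16k");
console.log(limits.maxTokens, limits.responseTokens, limits.requestTokens);
```

With this pattern, callers construct a TokenLimits for the configured model name and read the derived requestTokens budget when assembling prompts.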
TokenLimits is the only place that needed to be modified. I have set the token limits accordingly. Closes #406

Summary by CodeRabbit
New Feature:
- Added support for the "gpt-3.5-turbo-16k" model in the TokenLimits class.
- Set the maxTokens limit to 16300 and the responseTokens limit to 3000 for the new model.