Closed Bazingabc closed 1 year ago
OpenAI has increased the maximum token length supported in gpt-3.5-turbo to 16k. May I ask if it's possible to update the ModelType enumeration and relevant configurations accordingly?
The new 16k gpt-3.5 model is now part of the ModelType enum 🙂 It was just released in version 0.6.0.
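For anyone wondering what the change looks like, here is a minimal sketch of how an enum like this could be extended with the new model. The names and token limits below are illustrative assumptions, not the library's actual definitions (aside from gpt-3.5-turbo-16k's documented 16,384-token context window):

```python
from enum import Enum

# Hypothetical sketch of a ModelType enum extended with the
# 16k-context model; member names and the MAX_TOKENS mapping
# are illustrative, not the library's actual code.
class ModelType(Enum):
    GPT_3_5_TURBO = "gpt-3.5-turbo"
    GPT_3_5_TURBO_16K = "gpt-3.5-turbo-16k"

# Illustrative mapping from model to maximum context length.
MAX_TOKENS = {
    ModelType.GPT_3_5_TURBO: 4096,
    ModelType.GPT_3_5_TURBO_16K: 16384,
}

print(MAX_TOKENS[ModelType.GPT_3_5_TURBO_16K])  # 16384
```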