This PR adds a configurable environment variable for the model temperature, and support for the following new OpenAI models:
gpt-3.5-turbo-0613
gpt-3.5-turbo-16k (resolves to the 0613 snapshot listed next)
gpt-3.5-turbo-16k-0613
gpt-4-0613
gpt-4-32k-0613
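For the temperature override mentioned above, here is a minimal sketch of how an environment-based configuration could look. The variable name `OPENAI_TEMPERATURE` and the fallback default are assumptions for illustration only; the PR may use a different name and default value.

```python
import os

# Hypothetical helper: read the model temperature from the environment,
# falling back to a default when the variable is unset or invalid.
# OPENAI_TEMPERATURE is an assumed name, not necessarily the one used in this PR.
def get_temperature(default: float = 1.0) -> float:
    raw = os.environ.get("OPENAI_TEMPERATURE")
    if raw is None:
        return default
    try:
        return float(raw)
    except ValueError:
        # Ignore malformed values rather than failing at startup.
        return default
```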
If the model version is not specified, the context-length and token-count calculations now assume 0613 instead of the previous 0301/0314. This is intended to be consistent with the announced behaviour of the OpenAI API, which uses the 0613 models as the default from 27 June 2023 when no version is specified.
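As an illustration of the new default resolution, here is a sketch of how unversioned model names could map to context lengths. The table and function names are hypothetical, not taken from this PR; the token limits are OpenAI's published figures for these snapshots.

```python
# Context lengths for the explicitly versioned 0613 snapshots.
CONTEXT_LENGTHS = {
    "gpt-3.5-turbo-0613": 4096,
    "gpt-3.5-turbo-16k-0613": 16384,
    "gpt-4-0613": 8192,
    "gpt-4-32k-0613": 32768,
}

# Unversioned names now resolve as the 0613 snapshot (previously 0301/0314).
ALIASES = {
    "gpt-3.5-turbo": "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k": "gpt-3.5-turbo-16k-0613",
    "gpt-4": "gpt-4-0613",
    "gpt-4-32k": "gpt-4-32k-0613",
}

def context_length(model: str) -> int:
    """Return the context length, treating unversioned names as 0613."""
    return CONTEXT_LENGTHS[ALIASES.get(model, model)]
```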