This PR begins the work needed to support the new `gpt-4-1106-preview` GPT-4 Turbo preview model. In its current state the PR is incomplete, since it first requires the js-tiktoken dependency to support that model. That work within js-tiktoken is underway and can be tracked here: https://github.com/dqbd/tiktoken/pull/79
Once that lands, upgrading the dependency should resolve the type error in `getEncodingForModelCached`, which currently fails because `gpt-4-1106-preview` is not a supported model type.
I was unsure where to find the values for `tokens_per_message` and `tokens_per_name`, so I left them the same as their GPT-4 counterparts.
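For context, these two constants come from OpenAI's per-message token-counting scheme (documented in the OpenAI cookbook); for gpt-4 the published values are `tokens_per_message = 3` and `tokens_per_name = 1`, and this PR assumes the same values apply to the preview model. A rough TypeScript sketch of how they are used (the `countChatTokens` helper and its `encode` parameter are illustrative, not part of this repo's API):

```typescript
type ChatMessage = { role: string; content: string; name?: string };

// Sketch of OpenAI's chat token-counting scheme. `encode` stands in for a
// js-tiktoken encoder (e.g. encodingForModel(...).encode). The default
// constants are the gpt-4 values, assumed here to carry over unchanged
// to gpt-4-1106-preview.
function countChatTokens(
  messages: ChatMessage[],
  encode: (text: string) => number[],
  tokensPerMessage = 3, // gpt-4 value; assumption for the preview model
  tokensPerName = 1 // gpt-4 value; assumption for the preview model
): number {
  let total = 0;
  for (const msg of messages) {
    total += tokensPerMessage;
    total += encode(msg.role).length;
    total += encode(msg.content).length;
    if (msg.name !== undefined) {
      total += tokensPerName + encode(msg.name).length;
    }
  }
  // Every reply is primed with <|start|>assistant<|message|>.
  return total + 3;
}
```

If the preview model turns out to use different constants, only the two defaults above would need to change.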