The `OpenAIModel` class currently uses our own custom request logic for calling OpenAI & Azure OpenAI. That works fine for completions, but streaming support is more complicated, so I'm rewriting the class to use the official OpenAI client. As a result, a few configuration options will be deprecated and replaced with new ones, but I'm adding logic to properly map the deprecated options onto their replacements.
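The deprecated-option mapping described above could look something like the sketch below. The option names (`requestConfig`, `clientOptions`, etc.) are purely hypothetical placeholders, not the actual `OpenAIModel` configuration keys; the point is only the shape of the shim: copy unrecognized keys through, rename deprecated ones, warn once per deprecated key, and let an explicitly supplied new key win over its deprecated alias.

```python
import warnings

# Hypothetical mapping from deprecated config keys to their replacements.
# The real OpenAIModel option names may differ; this only illustrates the
# back-compat logic described in the PR.
_DEPRECATED_OPTIONS = {
    "requestConfig": "clientOptions",
    "endpoint": "baseURL",
}


def map_deprecated_options(options: dict) -> dict:
    """Return a copy of `options` with deprecated keys renamed to their
    replacements, emitting a DeprecationWarning for each one seen."""
    # Start with every key that is not deprecated, unchanged.
    mapped = {k: v for k, v in options.items() if k not in _DEPRECATED_OPTIONS}
    for old_key, new_key in _DEPRECATED_OPTIONS.items():
        if old_key in options:
            warnings.warn(
                f"Option '{old_key}' is deprecated; use '{new_key}' instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            # If the caller also passed the new key explicitly, keep that
            # value and ignore the deprecated alias.
            mapped.setdefault(new_key, options[old_key])
    return mapped
```

One design choice worth calling out: `setdefault` means a config that mixes old and new spellings of the same option resolves in favor of the new one, so callers can migrate incrementally without the deprecated key silently clobbering their updated settings.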