Passing the api_key in the additional kwargs to the litellm completion call caused an issue with Azure OpenAI.
To fix it, the api_key is now passed as an argument to the OpenAI instantiation and removed from the additional kwargs in the handler.
I don't know whether this might cause issues for other LiteLLM backends.
If it does, it probably indicates that a refactor is needed in order to handle multiple backends.
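A minimal sketch of the shape of the change, with hypothetical class and parameter names (not the actual handler code): the api_key is accepted explicitly at instantiation and stripped from the additional kwargs so it is never forwarded in the completion call, where Azure OpenAI rejects it.

```python
class OpenAIHandler:
    """Hypothetical handler illustrating the fix; names are illustrative."""

    def __init__(self, api_key, additional_kwargs):
        # Copy so we don't mutate the caller's dict.
        additional_kwargs = dict(additional_kwargs)
        # Strip api_key if a caller slipped it into the additional kwargs;
        # it is held separately and supplied at instantiation instead.
        additional_kwargs.pop("api_key", None)
        self.api_key = api_key
        self.kwargs = additional_kwargs

    def completion_args(self, prompt):
        # The completion call receives api_key once, explicitly,
        # rather than duplicated inside the extra kwargs.
        return {"prompt": prompt, "api_key": self.api_key, **self.kwargs}
```

For example, even if a caller passes `{"api_key": ..., "temperature": 0.2}` as the additional kwargs, only `temperature` survives into the forwarded kwargs.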