Describe the bug
I set up LiteLLM as a caching proxy for OpenAI calls, but there is no way to point the client at it: set_api_key does not support a custom OpenAI api_base.
To Reproduce
1. Set the OPENAI_API_BASE environment variable to the custom proxy address.
2. Create a GptConversation object and call set_api_key.
3. The call fails on client.models.list(), which accesses the default OpenAI URL (see the sketch below).
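A minimal reproduction sketch, assuming biochatter's GptConversation lives in biochatter.llm_connect and takes model_name/prompts arguments; the proxy address and key are illustrative:

```python
import os

from biochatter.llm_connect import GptConversation

# Point the OpenAI SDK at a LiteLLM caching proxy instead of api.openai.com
os.environ["OPENAI_API_BASE"] = "http://localhost:4000"  # illustrative proxy address

convo = GptConversation(model_name="gpt-3.5-turbo", prompts={})

# Fails with a 401: the openai.OpenAI client built inside set_api_key ignores
# OPENAI_API_BASE, so client.models.list() hits the default OpenAI URL.
convo.set_api_key(api_key="sk-proxy-key", user="test_user")  # illustrative key
```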
Stack trace
A 401 authentication error is raised because the default OpenAI base URL is used; as a result, the GptConversation.chat attribute is never initialized.
Expected behavior
set_api_key should accept an api_base hook so the client can be pointed at a custom proxy.
Additional context
Only a minor change is required:

```python
def set_api_key(self, api_key: str, user: str, api_base: str = None) -> bool:
    client = openai.OpenAI(
        api_key=api_key,
        base_url=api_base,  # openai>=1.0 names this parameter base_url, not api_base
    )
```
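With that hook in place, callers could route through the proxy explicitly; a hedged usage sketch of the patched signature (proxy address and key are illustrative):

```python
convo = GptConversation(model_name="gpt-3.5-turbo", prompts={})
convo.set_api_key(
    api_key="sk-proxy-key",            # key accepted by the LiteLLM proxy
    user="test_user",
    api_base="http://localhost:4000",  # LiteLLM proxy address
)
```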