biocypher / biochatter

Backend library for conversational AI in biomedicine
http://biochatter.org/
MIT License

[Bug] OpenAI proxy usage support. #202

Closed winternewt closed 1 month ago

winternewt commented 1 month ago

**Describe the bug**
I set up LiteLLM as a caching proxy for OpenAI calls, but biochatter does not support overriding the OpenAI API base URL (`api_base`).

**To Reproduce**
1. Set the `OPENAI_API_BASE` environment variable to a custom proxy address.
2. Create a `GptConversation` object and call `set_api_key`.
3. The call fails on `client.models.list()`, which hits the default OpenAI URL.
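For concreteness, a minimal reproduction sketch; the constructor arguments (`model_name`, a `prompts` dict) and the proxy address are placeholders for illustration:

```python
import os
from biochatter.llm_connect import GptConversation

# Point the environment at the LiteLLM proxy, as described above
# (the address is a placeholder).
os.environ["OPENAI_API_BASE"] = "http://localhost:4000"

convo = GptConversation(model_name="gpt-3.5-turbo", prompts={})

# The client built inside set_api_key still targets the default OpenAI
# base URL, so this fails with a 401 and convo.chat is never initialized.
convo.set_api_key(api_key="sk-proxy-key", user="test_user")
```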

**Stack trace**
A 401 authentication error is raised because the default OpenAI base URL is used; as a result, the `GptConversation.chat` attribute is not initialized.

**Expected behavior**
Add an `api_base` hook so the base URL can be overridden.

**Additional context**
Only a minor change is required:

```python
def set_api_key(self, api_key: str, user: str, api_base: str = None) -> bool:
    client = openai.OpenAI(
        api_key=api_key,
        base_url=api_base,  # openai>=1.0 exposes the override as base_url
    )
```
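With that change, a caller could point the conversation at the proxy directly; `api_base` is the proposed argument (not part of the current API) and the address is a placeholder:

```python
from biochatter.llm_connect import GptConversation

convo = GptConversation(model_name="gpt-3.5-turbo", prompts={})
convo.set_api_key(
    api_key="sk-proxy-key",
    user="test_user",
    api_base="http://localhost:4000",  # e.g. a local LiteLLM proxy
)
```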

winternewt commented 1 month ago

Or rather, to avoid touching the base class declaration, add it to `__init__` along with the model: https://github.com/biocypher/biochatter/pull/203
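For illustration, a simplified stand-in showing how the constructor-based approach could work; the actual change is in the linked PR and may differ in names and structure:

```python
import openai


class GptConversation:  # simplified stand-in for biochatter's class
    def __init__(self, model_name: str, base_url: str = None):
        self.model_name = model_name
        self.base_url = base_url  # None keeps the default OpenAI endpoint

    def set_api_key(self, api_key: str, user: str) -> bool:
        # base_url redirects all calls, including the models.list() auth check,
        # while the set_api_key signature stays unchanged.
        client = openai.OpenAI(api_key=api_key, base_url=self.base_url)
        try:
            client.models.list()
            return True
        except openai.AuthenticationError:
            return False
```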