zilliztech / GPTCache

Semantic cache for LLMs. Fully integrated with LangChain and llama_index.
https://gptcache.readthedocs.io
MIT License

[Feature]: Support for Azure OpenAI #568

Open amrit2cisco opened 7 months ago

amrit2cisco commented 7 months ago

Is your feature request related to a problem? Please describe.

We are currently using Azure OpenAI to make API calls. It seems that GPTCache only supports standard OpenAI API keys, so it isn't possible to route requests through Azure.

Describe the solution you'd like.

OpenAI's library allows routing requests through Azure with the following configuration:

openai.api_type = "azure"
openai.api_version = "2023-05-15"
openai.api_key = "<api key>"
openai.api_base = "https://<tenant>.openai.azure.com/"
model = "gpt-35-turbo"

Adding support for these settings in the GPTCache OpenAI adapter would make the same routing possible with GPTCache.

Describe an alternate solution.

N/A

Anything else? (Additional Context)

No response

SimFG commented 7 months ago

@amrit2cisco Do you mean to also introduce these constants in the openai file under gptcache?

amrit2cisco commented 7 months ago

@SimFG Yes. In particular, the Azure OpenAI API key (which is fundamentally different from the vanilla OpenAI API key) should be usable to route requests to the configured api_base when no standard OpenAI API key is set.
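
For illustration, here is a rough sketch of what this could look like through the GPTCache adapter, assuming the adapter were extended to honor the same module-level Azure settings. The AZURE_OPENAI_API_KEY environment variable and the engine pass-through are assumptions about the proposed behavior, not current GPTCache functionality:

import os
import openai as openai_sdk

from gptcache import cache
from gptcache.adapter import openai  # GPTCache's wrapper around the openai SDK

# Assumed behavior: the GPTCache adapter would pick up Azure routing from the
# same module-level settings used by the plain SDK.
openai_sdk.api_type = "azure"
openai_sdk.api_version = "2023-05-15"
openai_sdk.api_key = os.environ["AZURE_OPENAI_API_KEY"]     # assumed env var; no standard OpenAI key set
openai_sdk.api_base = "https://<tenant>.openai.azure.com/"  # placeholder tenant

cache.init()  # basic exact-match cache, as in the GPTCache quick start

# Hypothetical call: on a cache miss the adapter would forward the request to
# the Azure deployment configured above instead of api.openai.com.
response = openai.ChatCompletion.create(
    engine="gpt-35-turbo",  # Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)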