zilliztech / GPTCache

Semantic cache for LLMs. Fully integrated with LangChain and llama_index.
https://gptcache.readthedocs.io
MIT License

[Enhancement]: How gptcache can better adapt to openai 1.x #613

Open SimFG opened 4 months ago

SimFG commented 4 months ago

What would you like to be added?

Before openai 1.x, the API was exposed as static methods on classes, such as openai.ChatCompletion.create. In openai 1.x, a client object is used instead:

from openai import OpenAI
client = OpenAI(
    # Defaults to os.environ.get("OPENAI_API_KEY")
)

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello world"}]
)

So the previous approach of simply replacing the package name no longer gives seamless access to gptcache. At present, the only way I can think of is to proxy the relevant openai interfaces through wrapper functions, for example:

from typing import Any
from openai import OpenAI

def cache_openai_chat_complete(client: OpenAI, **openai_kwargs: Any):
    pass
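A minimal sketch of what such a proxy could look like, assuming a plain in-memory exact-match cache. Note the dict keyed on serialized request parameters is only a stand-in here; GPTCache itself performs semantic (embedding-based) lookup:

```python
import json
from typing import Any

# Illustrative in-memory exact-match cache. GPTCache's real store and
# similarity search would replace this dict.
_cache: dict[str, Any] = {}

def cache_openai_chat_complete(client: Any, **openai_kwargs: Any) -> Any:
    # Derive a deterministic key from the request parameters.
    key = json.dumps(openai_kwargs, sort_keys=True, default=str)
    if key in _cache:
        return _cache[key]
    # Cache miss: forward to the real client and store the result.
    response = client.chat.completions.create(**openai_kwargs)
    _cache[key] = response
    return response
```

Callers would pass their own OpenAI client plus the usual create() keyword arguments, so the only change at call sites is the function name.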

Why is this needed?

No response

Anything else?

No response

SimFG commented 3 months ago

If anyone has better ideas, suggestions are welcome. I have opened a PR: https://github.com/zilliztech/GPTCache/pull/614. I haven't merged the PR or released a new version yet, because I'd first like to hear more people's suggestions.
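One alternative direction for discussion (a hedged sketch, not necessarily what PR #614 does; patch_client_with_cache and the exact-match dict are illustrative names, not GPTCache API): wrap the client instance's create method in place, so existing call sites keep calling client.chat.completions.create unchanged:

```python
import json
from typing import Any

def patch_client_with_cache(client: Any) -> Any:
    """Wrap client.chat.completions.create with a cache lookup.

    Uses an in-memory exact-match cache for illustration; GPTCache
    would substitute its own semantic store here.
    """
    original_create = client.chat.completions.create
    cache: dict[str, Any] = {}

    def cached_create(**kwargs: Any) -> Any:
        key = json.dumps(kwargs, sort_keys=True, default=str)
        if key not in cache:
            cache[key] = original_create(**kwargs)
        return cache[key]

    # Replace the bound method on this client instance only.
    client.chat.completions.create = cached_create
    return client
```

The trade-off versus a standalone proxy function is that this keeps user code identical to plain openai usage, at the cost of mutating the client object.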