transitive-bullshit / agentic

AI agent stdlib that works with any LLM and TypeScript AI SDK.
https://agentic.so
MIT License

Caching of the OpenAI API responses #517

Closed: markNZed closed this issue 1 year ago

markNZed commented 1 year ago

Describe the feature

During development, it could be useful to cache OpenAI API responses while preserving behaviors like the incremental streaming of results. This could be implemented as a proxy in front of the OpenAI API, possibly as a separate project, similar to https://github.com/easychen/openai-api-proxy

Caching could also be done at the application level, on top of chatgpt-api, but features like streaming add some complications.
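To illustrate the streaming complication mentioned above, here is a minimal sketch of application-level caching that still delivers results incrementally: on a miss it records chunks while passing them through, and on a hit it replays the recorded chunks. The names (`StreamCache`, `fakeModel`) are hypothetical illustrations, not part of chatgpt-api or the OpenAI SDK.

```typescript
type Chunk = string;

class StreamCache {
  private store = new Map<string, Chunk[]>();

  // Stream a response: record chunks on a miss, replay them on a hit.
  async *stream(
    key: string,
    fetcher: () => AsyncGenerator<Chunk>
  ): AsyncGenerator<Chunk> {
    const cached = this.store.get(key);
    if (cached) {
      // Replay recorded chunks so callers still see incremental delivery.
      for (const chunk of cached) yield chunk;
      return;
    }
    const chunks: Chunk[] = [];
    for await (const chunk of fetcher()) {
      chunks.push(chunk); // record while passing through
      yield chunk;
    }
    // Only cache responses that streamed to completion.
    this.store.set(key, chunks);
  }
}

// Stand-in for a streaming OpenAI call (e.g. a chat completion with stream: true).
async function* fakeModel(): AsyncGenerator<Chunk> {
  yield "Hello";
  yield ", world";
}

const demoCache = new StreamCache();
(async () => {
  // First pass hits fakeModel; later passes with the same key replay the cache.
  for await (const chunk of demoCache.stream("greeting", fakeModel)) {
    process.stdout.write(chunk);
  }
})();
```

A real cache key would hash the full request payload (model, messages, temperature, etc.), since any parameter change should miss the cache.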

transitive-bullshit commented 1 year ago

great idea 💯

here are some related notes I had on this.

OpenAI proxy API

COSS / hosted proxy for interacting with the OpenAI API and/or other main AI APIs
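The proxy idea in these notes could be sketched as a small Node HTTP handler that keys on the incoming request and forwards misses to an injectable upstream (which in practice would `fetch` the OpenAI API). This is a minimal illustration, not part of any existing project; `createCachingProxy` and `Upstream` are made-up names.

```typescript
import { createServer, IncomingMessage, ServerResponse } from "node:http";

type Upstream = (body: string) => Promise<string>;

// Build a request handler that caches upstream responses by method, path, and body.
// `upstream` is injectable so the proxy can sit in front of any API.
function createCachingProxy(upstream: Upstream) {
  const cache = new Map<string, string>();
  return async (req: IncomingMessage, res: ServerResponse) => {
    let body = "";
    for await (const chunk of req) body += chunk;
    const key = `${req.method} ${req.url} ${body}`;
    let response = cache.get(key);
    if (response === undefined) {
      response = await upstream(body); // e.g. forward to https://api.openai.com
      cache.set(key, response);
    }
    res.setHeader("content-type", "application/json");
    res.end(response);
  };
}

// Usage sketch:
// const server = createServer(createCachingProxy(forwardToOpenAI));
// server.listen(8080);
```

A production version would also need to handle streamed (SSE) upstream responses, auth header passthrough, and cache invalidation, which is where a dedicated project like openai-api-proxy comes in.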

transitive-bullshit commented 1 year ago

I'm going to close this issue as out of scope for this repo, but I hope my notes above are useful for anyone looking to add this into their workflow – or anyone who wants to build this type of caching abstraction 🔥

thanks @markNZed 🙏