cequence-io / openai-scala-client

Scala client for OpenAI API
MIT License

Any plans to support caching for either Anthropic or VertexAI? #85

Open · antonio-veezoo opened 3 weeks ago

antonio-veezoo commented 3 weeks ago

Does the library currently support caching for these providers in any way or do you have plans to add support?

Thanks for any info!

peterbanda commented 2 weeks ago

Hey @antonio-veezoo ,

Could you elaborate a bit? What kind of caching do you have in mind: prompt, context, etc.?

antonio-veezoo commented 2 weeks ago

I'm thinking of the functionality described here for Claude: https://www.anthropic.com/news/prompt-caching, and here for Gemini: https://cloud.google.com/vertex-ai/generative-ai/docs/context-cache/context-cache-overview

peterbanda commented 2 weeks ago

OK, we can add prompt caching for the Anthropic models (it was on our list anyway).
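
For reference, Anthropic's prompt caching (per the announcement linked above) works by tagging a stable content block, typically a large system prompt, with a `cache_control` marker in the request body, so repeated calls reuse the cached prefix. The sketch below builds that JSON shape by hand, since `openai-scala-client` does not yet expose this; the object and method names are illustrative, not part of the library's API:

```scala
// Illustrative sketch only: shows the Anthropic Messages API request shape
// for prompt caching. Not openai-scala-client code; names are hypothetical.
object PromptCachingSketch {

  // The large, stable system prompt is the part worth caching; it is marked
  // with a "cache_control" block of type "ephemeral" as Anthropic documents.
  def cachedSystemBlock(text: String): String =
    s"""{"type": "text", "text": "$text", "cache_control": {"type": "ephemeral"}}"""

  // Full request body; the model name is an example, not a recommendation.
  def requestBody(systemPrompt: String, userMessage: String): String =
    s"""{
       |  "model": "claude-3-5-sonnet-20240620",
       |  "max_tokens": 1024,
       |  "system": [${cachedSystemBlock(systemPrompt)}],
       |  "messages": [{"role": "user", "content": "$userMessage"}]
       |}""".stripMargin

  def main(args: Array[String]): Unit =
    println(requestBody("You are a helpful assistant.", "Hello"))
}
```

Note that when this feature launched, Anthropic also required an `anthropic-beta: prompt-caching-2024-07-31` request header; a library integration would presumably set that alongside the body above.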