langchain-ai / langchain


Azure Cosmos DB Semantic Cache and Redis Semantic Cache not working as expected when the prompt is different #25161

Open AjayGanti opened 1 month ago

AjayGanti commented 1 month ago

Checked other resources

Example Code

I have attached two notebooks: one for the Redis semantic cache and another for caching with Azure Cosmos DB for MongoDB vCore. semantic_caching.zip
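For reference, the cache setup in the Cosmos notebook follows the LangChain documentation example and looks roughly like the sketch below (connection string, database name, collection name, and index parameters are placeholders; the embedding model in the actual notebook may differ):

```python
# Minimal sketch of the Azure Cosmos DB for MongoDB vCore semantic cache setup,
# modeled on the LangChain docs example. Values below are placeholders.
from langchain.globals import set_llm_cache
from langchain_community.cache import AzureCosmosDBSemanticCache
from langchain_community.vectorstores.azure_cosmos_db import (
    CosmosDBSimilarityType,
    CosmosDBVectorSearchType,
)
from langchain_openai import OpenAIEmbeddings

CONNECTION_STRING = "mongodb+srv://..."  # placeholder
DB_NAME = "langchain_cache_db"           # placeholder
COLLECTION_NAME = "semantic_cache"       # placeholder

set_llm_cache(
    AzureCosmosDBSemanticCache(
        cosmosdb_connection_string=CONNECTION_STRING,
        cosmosdb_client=None,
        embedding=OpenAIEmbeddings(),
        database_name=DB_NAME,
        collection_name=COLLECTION_NAME,
        num_lists=3,
        similarity=CosmosDBSimilarityType.COS,
        kind=CosmosDBVectorSearchType.VECTOR_IVF,
        dimensions=1536,
        m=16,
        ef_construction=64,
        ef_search=40,
        score_threshold=0.1,  # similarity score used when deciding cache hits
    )
)
```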

Error Message and Stack Trace (if applicable)

No response

Description

I am trying to use LangChain's semantic caching feature with Azure Cosmos DB for MongoDB vCore to get quicker responses, following the example given in the LangChain documentation. In my code, if I ask "Tell me a joke", the response comes back from the cache very quickly. But when the question is changed to "What to do when bored?", I expect LangChain to call the LLM instead of returning a response from the cache. Instead, it returns the same cached response for "Tell me a joke". I have attached the code and its output.
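The repro in the notebook boils down to two calls along these lines (a sketch; the actual notebook uses an Azure OpenAI deployment, so the model setup here is only illustrative):

```python
from langchain_openai import ChatOpenAI  # illustrative; the notebook uses an Azure OpenAI model

llm = ChatOpenAI(model="gpt-3.5-turbo")

# First call: cache miss, hits the LLM and stores the response in the semantic cache.
llm.invoke("Tell me a joke")

# Semantically unrelated prompt: expected to miss the cache and hit the LLM,
# but the cached joke for "Tell me a joke" is returned instead.
llm.invoke("What to do when bored?")
```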

I have tried the same with the Redis semantic cache and see the same behavior.
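The Redis notebook swaps in the Redis semantic cache with an otherwise identical flow, roughly as follows (a sketch; the Redis URL is a placeholder and the actual threshold used in the notebook may differ):

```python
from langchain.globals import set_llm_cache
from langchain_community.cache import RedisSemanticCache
from langchain_openai import OpenAIEmbeddings

set_llm_cache(
    RedisSemanticCache(
        redis_url="redis://localhost:6379",  # placeholder
        embedding=OpenAIEmbeddings(),
        score_threshold=0.2,  # similarity threshold used when matching prompts against cached entries
    )
)
```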

System Info

System Information

OS: Linux
OS Version: #1 SMP Fri Mar 29 23:14:13 UTC 2024
Python Version: 3.10.14 (main, May 6 2024, 19:42:50) [GCC 11.2.0]

Package Information

langchain_core: 0.2.28
langchain: 0.2.12
langchain_community: 0.2.11
langsmith: 0.1.98
langchain_openai: 0.1.20
langchain_text_splitters: 0.2.2

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

AjayGanti commented 3 weeks ago

Any updates on the above issue?