langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Caching for ChatAnthropic is not working as expected #19328

Open chtheiss opened 5 months ago

chtheiss commented 5 months ago

Checked other resources

Example Code

from langchain.cache import SQLiteCache
from langchain.globals import set_llm_cache
from langchain_anthropic import ChatAnthropic
from langchain_core.messages import HumanMessage

set_llm_cache(SQLiteCache(database_path=".langchain_cache.db"))

chat_model = ChatAnthropic(
        model="claude-3-sonnet-20240229",
        temperature=1.0,
        max_tokens=2048,
    )

message = HumanMessage(content="Hello World!")
response = chat_model.invoke([message])
print(response)

Error Message and Stack Trace (if applicable)

No response

Description

The cache key depends only on the messages, not on the parameters passed to the ChatAnthropic class.

This results in langchain hitting the cache instead of sending a new request to the API, even though parameters like temperature, max_tokens, or even the model have been changed.

I.e. when the first request containing just the message "Hello World!" was sent to "claude-3-sonnet-20240229" and the model is then changed to "claude-3-opus-20240229", langchain will still return the cached response from the first request.
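The behaviour described above can be illustrated with a toy cache in plain Python. This is a sketch of the keying problem, not LangChain's actual SQLiteCache implementation: a cache keyed only on the message returns stale hits when the model parameters change, while keying on (message, params) makes a parameter change a cache miss.

```python
# Toy illustration (not LangChain internals) of why the cache key
# must include the model parameters, not just the message.

# Buggy behaviour: key on the message alone.
naive_cache = {}

def naive_update(message, params, response):
    # params are ignored when storing, so they cannot affect lookups
    naive_cache[message] = response

def naive_lookup(message, params):
    # params are ignored when looking up as well
    return naive_cache.get(message)

naive_update("Hello World!", {"model": "claude-3-sonnet-20240229"}, "sonnet reply")
stale = naive_lookup("Hello World!", {"model": "claude-3-opus-20240229"})
# → "sonnet reply": the opus request incorrectly hits the sonnet entry

# Expected behaviour: key on (message, params).
correct_cache = {}

def key(message, params):
    # sort the params so the key is stable regardless of dict order
    return (message, tuple(sorted(params.items())))

correct_cache[key("Hello World!", {"model": "claude-3-sonnet-20240229"})] = "sonnet reply"
miss = correct_cache.get(key("Hello World!", {"model": "claude-3-opus-20240229"}))
# → None: changing the model is a cache miss, as it should be
```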

System Info

System Information

OS: Linux
OS Version: #25~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Tue Feb 20 16:09:15 UTC 2
Python Version: 3.10.13 (main, Sep 11 2023, 13:44:35) [GCC 11.2.0]

Package Information

langchain_core: 0.1.32
langchain: 0.1.12
langchain_community: 0.0.28
langsmith: 0.1.29
langchain_anthropic: 0.1.4
langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

liugddx commented 5 months ago

Let me see

liugddx commented 5 months ago

Upgrade langchain_core and try again.

Alchemication commented 4 months ago

@liugddx I've just checked after upgrading langchain_core and langchain-anthropic, and I'm still facing the same issue. After changing the model from Haiku to Opus, it still reads from the cache.

My env:

langchain==0.1.13
langchain-anthropic==0.1.8
langchain-community==0.0.29
langchain-core==0.1.42
langchain-google-genai==0.0.8
langchain-openai==0.1.1
langchain-text-splitters==0.0.1
langsmith==0.1.31
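Until the cache key accounts for model parameters, one possible stopgap (a sketch, not an official LangChain API) is to derive a separate SQLite cache file per model configuration, so that changing the model or temperature can never hit a stale entry. The helper `cache_path_for` below is hypothetical:

```python
import hashlib
import json

def cache_path_for(params: dict) -> str:
    # Hypothetical helper: derive a distinct cache filename from the
    # model parameters, so each configuration gets its own SQLite file.
    # sort_keys=True makes the digest independent of dict insertion order.
    digest = hashlib.sha256(
        json.dumps(params, sort_keys=True).encode()
    ).hexdigest()[:12]
    return f".langchain_cache_{digest}.db"
```

The returned path would then be passed as `SQLiteCache(database_path=cache_path_for({...}))` before calling `set_llm_cache`, once per configuration.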

Thomas-Gentilhomme commented 2 months ago

Hi @chtheiss @Alchemication! I am facing the same issue; have you been able to find a fix? Many thanks!