langchain-ai / chat-langchain


llm_cache no longer supported from root module #198


AndyMik90 commented 9 months ago

Got the code up and running locally, but I'm getting this warning:

    /venv/lib/python3.11/site-packages/langchain/__init__.py:34: UserWarning: Importing llm_cache from langchain root module is no longer supported. Please use langchain.globals.set_llm_cache() / langchain.globals.get_llm_cache() instead.
      warnings.warn(

But llm_cache is not used directly in main.py.
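For reference, the warning is raised whenever something reads or assigns `langchain.llm_cache` on the root module, which a dependency may still do even if main.py doesn't. The replacement it points to looks roughly like this (a minimal sketch; `InMemoryCache` is just an illustrative cache class, not necessarily what this project uses):

```python
from langchain.globals import set_llm_cache, get_llm_cache
from langchain_community.cache import InMemoryCache  # illustrative cache implementation

# Old, deprecated pattern that triggers the UserWarning:
#   import langchain
#   langchain.llm_cache = InMemoryCache()

# Supported pattern per the warning message:
set_llm_cache(InMemoryCache())

# Reading the cache back, instead of accessing langchain.llm_cache:
cache = get_llm_cache()
```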

BlackSiao commented 4 months ago

Same issue, did you solve it?