Open urbanscribe opened 10 months ago
This sounds like another dependency might be overwriting your installed version of langchain
with a different version.
Next time you encounter this error, before you do the force-reinstall step, could you run a few commands and report their output here?
pip freeze | grep langchain
pip install pipdeptree && pipdeptree --reverse
This will help us figure out why the langchain
version is being overwritten and by which other package.
for sure here you go
(venv) user@Mac-Studio newfilesystem % pip freeze | grep langchain
langchain==0.0.275
(venv) user@Mac-Studio newfilesystem % pip install pipdeptree && pipdeptree --reverse
Collecting pipdeptree
Downloading pipdeptree-2.13.0-py3-none-any.whl (26 kB)
Installing collected packages: pipdeptree
Successfully installed pipdeptree-2.13.0
WARNING: You are using pip version 22.0.4; however, version 23.2.1 is available.
You should consider upgrading via the '/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/bin/python -m pip install --upgrade pip' command.
annotated-types==0.5.0
async-timeout==4.0.3
├── aiohttp==3.8.5 [requires: async-timeout>=4.0.0a3,<5.0]
│   ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
│   └── openai==0.27.9 [requires: aiohttp]
└── langchain==0.0.275 [requires: async-timeout>=4.0.0,<5.0.0]
attrs==23.1.0
├── aiohttp==3.8.5 [requires: attrs>=17.3.0]
│   ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
│   └── openai==0.27.9 [requires: aiohttp]
├── jsonschema==4.19.0 [requires: attrs>=22.2.0]
│   └── altair==5.0.1 [requires: jsonschema>=3.0]
│       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
└── referencing==0.30.2 [requires: attrs>=22.2.0]
    ├── jsonschema==4.19.0 [requires: referencing>=0.28.4]
    │   └── altair==5.0.1 [requires: jsonschema>=3.0]
    │       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── jsonschema-specifications==2023.7.1 [requires: referencing>=0.28.0]
        └── jsonschema==4.19.0 [requires: jsonschema-specifications>=2023.03.6]
            └── altair==5.0.1 [requires: jsonschema>=3.0]
                └── streamlit==1.26.0 [requires: altair>=4.0,<6]
backoff==2.2.1
└── posthog==3.0.2 [requires: backoff>=1.10.0]
    └── chromadb==0.4.7 [requires: posthog>=2.4.0]
bcrypt==4.0.1
└── chromadb==0.4.7 [requires: bcrypt>=4.0.1]
blinker==1.6.2
└── streamlit==1.26.0 [requires: blinker>=1.0.0,<2]
cachetools==5.3.1
└── streamlit==1.26.0 [requires: cachetools>=4.0,<6]
certifi==2023.7.22
├── pulsar-client==3.2.0 [requires: certifi]
│   └── chromadb==0.4.7 [requires: pulsar-client>=3.1.0]
└── requests==2.31.0 [requires: certifi>=2017.4.17]
    ├── chromadb==0.4.7 [requires: requests>=2.28]
    ├── langchain==0.0.275 [requires: requests>=2,<3]
    ├── langsmith==0.0.27 [requires: requests>=2,<3]
    │   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    ├── openai==0.27.9 [requires: requests>=2.20]
    ├── posthog==3.0.2 [requires: requests>=2.7,<3.0]
    │   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    ├── streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
charset-normalizer==3.2.0
├── aiohttp==3.8.5 [requires: charset-normalizer>=2.0,<4.0]
│   ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
│   └── openai==0.27.9 [requires: aiohttp]
└── requests==2.31.0 [requires: charset-normalizer>=2,<4]
    ├── chromadb==0.4.7 [requires: requests>=2.28]
    ├── langchain==0.0.275 [requires: requests>=2,<3]
    ├── langsmith==0.0.27 [requires: requests>=2,<3]
    │   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    ├── openai==0.27.9 [requires: requests>=2.20]
    ├── posthog==3.0.2 [requires: requests>=2.7,<3.0]
    │   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    ├── streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
click==8.1.7
├── streamlit==1.26.0 [requires: click>=7.0,<9]
└── uvicorn==0.23.2 [requires: click>=7.0]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
colorlog==6.7.0
docx2txt==0.8
exceptiongroup==1.1.3
└── anyio==3.7.1 [requires: exceptiongroup]
    ├── starlette==0.27.0 [requires: anyio>=3.4.0,<5]
    │   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
    │       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
    └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
flatbuffers==23.5.26
└── onnxruntime==1.15.1 [requires: flatbuffers]
    └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
frozenlist==1.4.0
├── aiohttp==3.8.5 [requires: frozenlist>=1.1.1]
│   ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
│   └── openai==0.27.9 [requires: aiohttp]
└── aiosignal==1.3.1 [requires: frozenlist>=1.1.0]
    └── aiohttp==3.8.5 [requires: aiosignal>=1.1.2]
        ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
greenlet==2.0.2
└── SQLAlchemy==2.0.20 [requires: greenlet!=0.4.17]
    └── langchain==0.0.275 [requires: SQLAlchemy>=1.4,<3]
h11==0.14.0
└── uvicorn==0.23.2 [requires: h11>=0.8]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
httptools==0.6.0
humanfriendly==10.0
└── coloredlogs==15.0.1 [requires: humanfriendly>=9.1]
    └── onnxruntime==1.15.1 [requires: coloredlogs]
        └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
idna==3.4
├── anyio==3.7.1 [requires: idna>=2.8]
│   ├── starlette==0.27.0 [requires: anyio>=3.4.0,<5]
│   │   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
│   │       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
│   └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
├── requests==2.31.0 [requires: idna>=2.5,<4]
│   ├── chromadb==0.4.7 [requires: requests>=2.28]
│   ├── langchain==0.0.275 [requires: requests>=2,<3]
│   ├── langsmith==0.0.27 [requires: requests>=2,<3]
│   │   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
│   ├── openai==0.27.9 [requires: requests>=2.20]
│   ├── posthog==3.0.2 [requires: requests>=2.7,<3.0]
│   │   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
│   ├── streamlit==1.26.0 [requires: requests>=2.18,<3]
│   └── tiktoken==0.4.0 [requires: requests>=2.26.0]
└── yarl==1.9.2 [requires: idna>=2.0]
    └── aiohttp==3.8.5 [requires: yarl>=1.0,<2.0]
        ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
logger==1.4
MarkupSafe==2.1.3
└── Jinja2==3.1.2 [requires: MarkupSafe>=2.0]
    ├── altair==5.0.1 [requires: Jinja2]
    │   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── pydeck==0.8.0 [requires: Jinja2>=2.10.1]
        └── streamlit==1.26.0 [requires: pydeck>=0.8,<1]
mdurl==0.1.2
└── markdown-it-py==3.0.0 [requires: mdurl~=0.1]
    └── rich==13.5.2 [requires: markdown-it-py>=2.2.0]
        └── streamlit==1.26.0 [requires: rich>=10.14.0,<14]
monotonic==1.6
└── posthog==3.0.2 [requires: monotonic>=1.5]
    └── chromadb==0.4.7 [requires: posthog>=2.4.0]
mpmath==1.3.0
└── sympy==1.12 [requires: mpmath>=0.19]
    └── onnxruntime==1.15.1 [requires: sympy]
        └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
multidict==6.0.4
├── aiohttp==3.8.5 [requires: multidict>=4.5,<7.0]
│   ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
│   └── openai==0.27.9 [requires: aiohttp]
└── yarl==1.9.2 [requires: multidict>=4.0]
    └── aiohttp==3.8.5 [requires: yarl>=1.0,<2.0]
        ├── langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
mypy-extensions==1.0.0
└── typing-inspect==0.9.0 [requires: mypy-extensions>=0.3.0]
    └── dataclasses-json==0.5.14 [requires: typing-inspect>=0.4.0,<1]
        └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
numpy==1.25.2
├── altair==5.0.1 [requires: numpy]
│   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
├── chroma-hnswlib==0.7.2 [requires: numpy]
│   └── chromadb==0.4.7 [requires: chroma-hnswlib==0.7.2]
├── chromadb==0.4.7 [requires: numpy>=1.21.6]
├── langchain==0.0.275 [requires: numpy>=1,<2]
├── numexpr==2.8.5 [requires: numpy>=1.13.3]
│   └── langchain==0.0.275 [requires: numexpr>=2.8.4,<3.0.0]
├── onnxruntime==1.15.1 [requires: numpy>=1.21.6]
│   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
├── pandas==2.0.3 [requires: numpy>=1.20.3]
│   ├── altair==5.0.1 [requires: pandas>=0.18]
│   │   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
│   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
├── pyarrow==13.0.0 [requires: numpy>=1.16.6]
│   └── streamlit==1.26.0 [requires: pyarrow>=6.0]
├── pydeck==0.8.0 [requires: numpy>=1.16.4]
│   └── streamlit==1.26.0 [requires: pydeck>=0.8,<1]
└── streamlit==1.26.0 [requires: numpy>=1.19.3,<2]
overrides==7.4.0
└── chromadb==0.4.7 [requires: overrides>=7.3.1]
packaging==23.1
├── marshmallow==3.20.1 [requires: packaging>=17.0]
│   └── dataclasses-json==0.5.14 [requires: marshmallow>=3.18.0,<4.0.0]
│       └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
├── onnxruntime==1.15.1 [requires: packaging]
│   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
└── streamlit==1.26.0 [requires: packaging>=16.8,<24]
Pillow==9.5.0
└── streamlit==1.26.0 [requires: Pillow>=7.1.0,<10]
pip==22.0.4
pipdeptree==2.13.0
protobuf==4.24.2
├── onnxruntime==1.15.1 [requires: protobuf]
│   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
└── streamlit==1.26.0 [requires: protobuf>=3.20,<5]
Pygments==2.16.1
└── rich==13.5.2 [requires: Pygments>=2.13.0,<3.0.0]
    └── streamlit==1.26.0 [requires: rich>=10.14.0,<14]
Pympler==1.0.1
└── streamlit==1.26.0 [requires: Pympler>=0.9,<2]
PyPika==0.48.9
└── chromadb==0.4.7 [requires: PyPika>=0.48.9]
python-dotenv==1.0.0
pytz==2023.3
└── pandas==2.0.3 [requires: pytz>=2020.1]
    ├── altair==5.0.1 [requires: pandas>=0.18]
    │   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
PyYAML==6.0.1
└── langchain==0.0.275 [requires: PyYAML>=5.3]
regex==2023.8.8
└── tiktoken==0.4.0 [requires: regex>=2022.1.18]
rpds-py==0.10.0
├── jsonschema==4.19.0 [requires: rpds-py>=0.7.1]
│   └── altair==5.0.1 [requires: jsonschema>=3.0]
│       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
└── referencing==0.30.2 [requires: rpds-py>=0.7.0]
    ├── jsonschema==4.19.0 [requires: referencing>=0.28.4]
    │   └── altair==5.0.1 [requires: jsonschema>=3.0]
    │       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── jsonschema-specifications==2023.7.1 [requires: referencing>=0.28.0]
        └── jsonschema==4.19.0 [requires: jsonschema-specifications>=2023.03.6]
            └── altair==5.0.1 [requires: jsonschema>=3.0]
                └── streamlit==1.26.0 [requires: altair>=4.0,<6]
setuptools==58.1.0
six==1.16.0
├── posthog==3.0.2 [requires: six>=1.5]
│   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
└── python-dateutil==2.8.2 [requires: six>=1.5]
    ├── pandas==2.0.3 [requires: python-dateutil>=2.8.2]
    │   ├── altair==5.0.1 [requires: pandas>=0.18]
    │   │   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    │   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
    ├── posthog==3.0.2 [requires: python-dateutil>2.1]
    │   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    └── streamlit==1.26.0 [requires: python-dateutil>=2.7.3,<3]
smmap==5.0.0
└── gitdb==4.0.10 [requires: smmap>=3.0.1,<6]
    └── GitPython==3.1.32 [requires: gitdb>=4.0.1,<5]
        └── streamlit==1.26.0 [requires: GitPython>=3.0.7,<4,!=3.1.19]
sniffio==1.3.0
└── anyio==3.7.1 [requires: sniffio>=1.1]
    ├── starlette==0.27.0 [requires: anyio>=3.4.0,<5]
    │   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
    │       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
    └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
tenacity==8.2.3
├── langchain==0.0.275 [requires: tenacity>=8.1.0,<9.0.0]
└── streamlit==1.26.0 [requires: tenacity>=8.1.0,<9]
tokenizers==0.13.3
└── chromadb==0.4.7 [requires: tokenizers>=0.13.2]
toml==0.10.2
└── streamlit==1.26.0 [requires: toml>=0.10.1,<2]
toolz==0.12.0
└── altair==5.0.1 [requires: toolz]
    └── streamlit==1.26.0 [requires: altair>=4.0,<6]
tornado==6.3.3
└── streamlit==1.26.0 [requires: tornado>=6.0.3,<7]
tqdm==4.66.1
├── chromadb==0.4.7 [requires: tqdm>=4.65.0]
└── openai==0.27.9 [requires: tqdm]
typing-extensions==4.7.1
├── altair==5.0.1 [requires: typing-extensions>=4.0.1]
│   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
├── chromadb==0.4.7 [requires: typing-extensions>=4.5.0]
├── fastapi==0.99.1 [requires: typing-extensions>=4.5.0]
│   └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
├── pydantic==1.10.12 [requires: typing-extensions>=4.2.0]
│   ├── chromadb==0.4.7 [requires: pydantic>=1.9,<2.0]
│   ├── fastapi==0.99.1 [requires: pydantic>=1.7.4,<2.0.0,!=1.8.1,!=1.8]
│   │   └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
│   ├── langchain==0.0.275 [requires: pydantic>=1,<3]
│   └── langsmith==0.0.27 [requires: pydantic>=1,<3]
│       └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
├── pydantic-core==2.6.3 [requires: typing-extensions>=4.6.0,!=4.7.0]
├── pypdf==3.15.4 [requires: typing-extensions>=3.7.4.3]
├── SQLAlchemy==2.0.20 [requires: typing-extensions>=4.2.0]
│   └── langchain==0.0.275 [requires: SQLAlchemy>=1.4,<3]
├── starlette==0.27.0 [requires: typing-extensions>=3.10.0]
│   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
│       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
├── streamlit==1.26.0 [requires: typing-extensions>=4.1.0,<5]
├── typing-inspect==0.9.0 [requires: typing-extensions>=3.7.4]
│   └── dataclasses-json==0.5.14 [requires: typing-inspect>=0.4.0,<1]
│       └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
└── uvicorn==0.23.2 [requires: typing-extensions>=4.0]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
tzdata==2023.3
├── pandas==2.0.3 [requires: tzdata>=2022.1]
│   ├── altair==5.0.1 [requires: pandas>=0.18]
│   │   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
│   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
└── pytz-deprecation-shim==0.1.0.post0 [requires: tzdata]
    └── tzlocal==4.3.1 [requires: pytz-deprecation-shim]
        └── streamlit==1.26.0 [requires: tzlocal>=1.1,<5]
urllib3==2.0.4
└── requests==2.31.0 [requires: urllib3>=1.21.1,<3]
    ├── chromadb==0.4.7 [requires: requests>=2.28]
    ├── langchain==0.0.275 [requires: requests>=2,<3]
    ├── langsmith==0.0.27 [requires: requests>=2,<3]
    │   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    ├── openai==0.27.9 [requires: requests>=2.20]
    ├── posthog==3.0.2 [requires: requests>=2.7,<3.0]
    │   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    ├── streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
uvloop==0.17.0
validators==0.21.2
└── streamlit==1.26.0 [requires: validators>=0.2,<1]
websockets==11.0.3
zipp==3.16.2
├── importlib-metadata==6.8.0 [requires: zipp>=0.5]
│   └── streamlit==1.26.0 [requires: importlib-metadata>=1.4,<7]
└── importlib-resources==6.0.1 [requires: zipp>=3.1.0]
    └── chromadb==0.4.7 [requires: importlib-resources]
Very unusual! Thanks for sending that over.
Would you mind adding a few lines of code just before the crashing line in your app and posting what those lines print out when the app crashes?
Add the new code right before this line in your app:
File "/Users/user/Developer/newfilesystem/pages/chat.py", line 104, in <module>
llm = ChatOpenAI(
Here are the lines to add:
import langchain
print("*** Running langchain version:", langchain.__version__)
print("*** langchain contents:", dir(langchain))
If your app does logging in a way such that print
statement output is not captured, feel free to amend the lines to redirect their output as needed to capture it right before the app crashes with the error you posted.
sure thing
here is the output
AttributeError: module 'langchain' has no attribute 'verbose'
*** Running langchain version: 0.0.275
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'cache', 'debug', 'document_loaders', 'document_transformers', 'embeddings', 'llm_cache', 'retrievers', 'vectorstores', 'verbose']
Something very very strange is happening here, and I'm sure you can see it too: the langchain contents
line lists 'verbose'
as one of the attributes that exists on langchain
, and yet the AttributeError
says module 'langchain' has no attribute 'verbose'
.
Could I get you to add one more printing line next to the ones you already added?
print("*** langchain verbose value:", getattr(langchain, "verbose", "<non existent>"))
Also, could you scan your application code to make sure you never do anything like del langchain["verbose"]
or something like that? It would be a strange thing to do, but we're in strange territory already so I have to ask.
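One more low-effort check that can help pin this down: print which on-disk file the module was actually imported from. Here is a minimal sketch of the idea — it uses json as a stand-in module name so it runs anywhere, but in your app you would pass "langchain" and check the path points inside the virtualenv you expect:

```python
import importlib
import sys


def describe_module(name: str) -> str:
    """Return the on-disk location Python resolved `name` to."""
    mod = importlib.import_module(name)
    # Namespace packages may lack __file__, so fall back to the spec's path.
    return getattr(mod, "__file__", None) or str(getattr(mod, "__path__", "<built-in>"))


if __name__ == "__main__":
    # For the case in this thread, call describe_module("langchain") instead.
    print("json loaded from:", describe_module("json"))
    print("searched first:", sys.path[0] or "<current directory>")
```

If the printed path points at your working directory rather than site-packages, the import is resolving to the wrong module.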
*** langchain verbose value: False
pretty consistently
KeyError: 'langchain'
*** Running langchain version: 0.0.276
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'adapters', 'agents', 'base_language', 'cache', 'chains', 'chat_loaders', 'chat_models', 'debug', 'document_loaders', 'document_transformers', 'embeddings', 'graphs', 'llm_cache', 'llms', 'memory', 'output_parsers', 'prompts', 'requests', 'retrievers', 'text_splitter', 'tools', 'vectorstores', 'verbose']
*** langchain verbose value: False
Name: pydantic
Version: 1.10.12
Summary: Data validation and settings management using python type hints
Home-page: https://github.com/pydantic/pydantic
Author: Samuel Colvin
Author-email: s@muelcolvin.com
License: MIT
Location: /Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages
Requires: typing-extensions
Required-by: chainlit, chromadb, fastapi, langchain, langsmith, openapi-schema-pydantic, openbb, prisma
(venv) alexfuchs@Mac-Studio newfilesystem % pip show langchain
Name: langchain
Version: 0.0.276
Summary: Building applications with LLMs through composability
Home-page: https://github.com/langchain-ai/langchain
Author:
Author-email:
License: MIT
Location: /Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages
Requires: aiohttp, async-timeout, dataclasses-json, langsmith, numexpr, numpy, pydantic, PyYAML, requests, SQLAlchemy, tenacity
Required-by: llama-index, openbb
new intermittent error:
/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
exec(code, module.__dict__)
File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/pages/chatNEW.py", line 248, in
FYI, I pretty much have to run pip install --upgrade --force-reinstall --no-deps --no-cache-dir langchain every time I start the streamlit app.
Is it possible that the app is running inside a different virtualenv than where you added the print statements? I think something like that is the only possible answer here because of the printout value you showed earlier:
*** langchain verbose value: False
This was obtained by reading langchain.verbose
, which is the value that the crashing code says doesn't exist. But it clearly exists with value False
here, so it has to be the case that the error is coming from some other process which is invoking langchain in a different (probably outdated) virtualenv.
Going back through your previous messages, it seems like there might indeed be two virtualenvs at different paths and with different versions: one in python 3.9 and one in python 3.10.
Could you look through your machine (and your previous messages here) and make sure to run the pip freeze
and pipdeptree
commands in both virtualenvs? Here are the full commands one more time, for your convenience:
pip freeze | grep langchain
pip install pipdeptree && pipdeptree --reverse
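To confirm which interpreter (and which virtualenv) each process is actually using, a few standard-library prints dropped next to the earlier ones will settle it. A minimal sketch:

```python
import sys

# Which interpreter binary is running this process, and which environment it
# belongs to. If these differ between the app process and your shell, you've
# found the mismatch.
print("executable:", sys.executable)
print("prefix:", sys.prefix)
print("in a virtualenv:", sys.prefix != getattr(sys, "base_prefix", sys.prefix))
```

Run the same three lines both inside the streamlit app and in the shell where you ran pip freeze, and compare the output.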
@obi1kenobi Hi, I also encounter this issue. I got verbose=True, but when I initialize an agent, I get AttributeError: module 'langchain' has no attribute 'verbose'
import langchain
print("*** Running langchain version:", langchain.__version__)
print("*** langchain contents:", dir(langchain))
*** Running langchain version: 0.0.239
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'PALChain', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SQLDatabaseChain', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'agents', 'cache', 'debug', 'llm_cache', 'verbose']
print("*** langchain verbose value:", getattr(langchain, "verbose", "<non existent>"))
*** langchain verbose value: True
I use langchain V0.0.239, and the code is as following:
from langchain.llms import LlamaCpp
from langchain.agents import load_tools, initialize_agent, AgentType
llms = LlamaCpp(model_path="/home/7B/ggml-model-f16.gguf")
tools = load_tools(["serpapi",], llm=llms)
agent = initialize_agent(tools, llms, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
the error
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
Cell In[41], line 1
----> 1 agent = initialize_agent(tools, llms, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/agents/initialize.py:57, in initialize_agent(tools, llm, agent, callback_manager, agent_path, agent_kwargs, tags, **kwargs)
55 agent_cls = AGENT_TO_CLASS[agent]
56 agent_kwargs = agent_kwargs or {}
---> 57 agent_obj = agent_cls.from_llm_and_tools(
58 llm, tools, callback_manager=callback_manager, **agent_kwargs
59 )
60 elif agent_path is not None:
61 agent_obj = load_agent(
62 agent_path, llm=llm, tools=tools, callback_manager=callback_manager
63 )
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/agents/mrkl/base.py:110, in ZeroShotAgent.from_llm_and_tools(cls, llm, tools, callback_manager, output_parser, prefix, suffix, format_instructions, input_variables, **kwargs)
102 cls._validate_tools(tools)
103 prompt = cls.create_prompt(
104 tools,
105 prefix=prefix,
(...)
108 input_variables=input_variables,
109 )
--> 110 llm_chain = LLMChain(
111 llm=llm,
112 prompt=prompt,
113 callback_manager=callback_manager,
114 )
115 tool_names = [tool.name for tool in tools]
116 _output_parser = output_parser or cls._get_default_output_parser()
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/load/serializable.py:74, in Serializable.__init__(self, **kwargs)
73 def __init__(self, **kwargs: Any) -> None:
---> 74 super().__init__(**kwargs)
75 self._lc_kwargs = kwargs
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/main.py:339, in pydantic.main.BaseModel.__init__()
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/main.py:1066, in pydantic.main.validate_model()
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/fields.py:439, in pydantic.fields.ModelField.get_default()
File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/chains/base.py:30, in _get_verbosity()
29 def _get_verbosity() -> bool:
---> 30 return langchain.verbose
AttributeError: module 'langchain' has no attribute 'verbose'
It's weird that I didn't change anything: the code ran without error yesterday, but fails with this error when I reload the LLM model today.
This is very strange, and I haven't been able to replicate this on my end no matter what I try.
Any chance you can try to construct a minimal repro that triggers the problem on a clean installation of Python or anaconda, such as inside a Docker container? There's clearly some kind of problem here, but since it's only happening on your end, I'm not able to debug it or even figure out which combination of Python, anaconda, something around the model, or langchain itself is contributing to the issue popping up.
While I still don't know what's causing this, I opened #11311 recently with some refactoring that should make a few of the possible underlying causes go away. After that PR gets merged and released, I'd love your help with testing to see if it perhaps makes this issue go away.
Sorry we haven't been able to get to the bottom of this yet, and thanks for bearing with us as we try to figure out what's going on!
I'm also getting module 'langchain' has no attribute 'debug'
it happens more often if there are two streamlit front ends open on the same backend. but it does happen when there is a single one.
can you try putting import langchain
at the top of your script? i think there's some funkiness with the way streamlit loads modules. if you make sure to load langchain fully (by just putting import langchain
at the top) that usually fixes it
I have a related issue, where using langchain 0.0.311, I always get the following error: AttributeError: type object 'APIOperation' has no attribute 'from_openapi_url' (although following the path in VS Code clearly shows the code for the attribute)
After removing langchain in my poetry virtualenv and installing specifically langchain=0.0.232, the error disappeared. I chose this version because I'm successfully using it in one of my other projects.
Same issue using langchain 0.1.4 on Windows 10 / CUDA 11.8 / Python 3.10.2.
Errors with attributes debug
, verbose
, and llm_cache
.
llm_cache
can't be a bool, so I cannot just set it to True or False.
https://github.com/langchain-ai/langchain/issues/2079#issuecomment-1487416187
Found the comment here and found that I'm not a smart man either.
Renaming langchain.py
to main.py
fixed it!
@obi1kenobi the import langchain
in get_verbose
and get_debug
is pulling in the user's langchain
module.
$ echo 'print("this is the wrong langchain")' > langchain.py
$ poetry run python -c 'from langchain_core.globals import get_verbose; print(get_verbose())'
this is the wrong langchain
Traceback (most recent call last):
File "<string>", line 1, in <module>
File ".../lib/python3.12/site-packages/langchain_core/globals/__init__.py", line 72, in get_verbose
old_verbose = langchain.verbose
^^^^^^^^^^^^^^^^^
AttributeError: module 'langchain' has no attribute 'verbose'
this will also happen for get_llm_cache
.
the langchain_core/globals/__init__.py
code could except AttributeError
to protect from this. it could also raise a warning to the user that they're masking the langchain
module with their own.
i ran into this because i had a clone of langchain in my current working directory.
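to make the proposal concrete, here's a rough sketch of such a guard. the helper name import_checked is made up for illustration (it's not an actual langchain_core API), and the real fix would need to live wherever the import happens:

```python
import importlib
import os
import warnings


def import_checked(name: str):
    """Import `name`, warning if it resolved to a file in the current working
    directory -- a sign the user is shadowing an installed package."""
    mod = importlib.import_module(name)
    location = getattr(mod, "__file__", None) or ""
    if location and os.path.realpath(os.path.dirname(location)) == os.path.realpath(os.getcwd()):
        warnings.warn(
            f"module {name!r} was imported from the current working directory "
            f"({location}); it may be shadowing an installed package of the same name",
            stacklevel=2,
        )
    return mod
```

the AttributeError-catching part would still be needed separately, since the shadowing module imports "successfully" and only fails later on attribute access.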
I don't believe this would work. If there's another langchain
module that import langchain
ends up resolving to, then I think none of the langchain_core
code gets executed and there's no way any addition to it could warn or prevent the situation.
It's possible I might be misunderstanding the solution you are proposing? If so, I'd love to see a PR and a test that shows how it would work.
@obi1kenobi something along the lines of https://github.com/langchain-ai/langchain/commit/8d5e110e433152b8cdc94dd0e51c5c5640165451, but w/ support for debug & llm_cache and maybe better module detection. really we're detecting a problematic user environment and trying to guide them to fix it. the warning should arguably be an informative error.
The user might be running a script like the following:
import langchain
print(langchain.verbose)
Right?
If so, the suggested code you added in langchain_core
never gets hit, because langchain_core
never gets imported.
The user's import langchain
imports the wrong thing and everything goes astray immediately. Nothing in the "real" langchain
or langchain_core
packages gets a chance to execute to prevent the problem or warn about it.
Or am I still misunderstanding the situation, and perhaps the end user is running import langchain_core
directly somehow?
you're right, all bets are off if a user does import langchain
and get their own module instead of langchain-ai/langchain.
however, i ran into this w/ code that was using import langchain_core
and tripping over a langchain/
in my working directory.
Interesting! Will defer to @hwchase17 in that case, since I've been a bit out of the loop on things here and I'm not sure if import langchain_core
is an expected pattern for users or if things are generally expected to go via import langchain
instead.
I got similar errors while trying to integrate prompt flow and langchain for a project. Precisely, I got AttributeErrors for these attributes: langchain.verbose
, langchain.debug
, and langchain.llm_cache
In my case, this seems to be solvable by extending the exception handling for setting the globals to also catch attribute errors.
Precisely, line 73 would instead be
except (ImportError, AttributeError):
This needs to be at least done for all 3 get functions in that file. I have not tested it a lot, and I am unsure if this has any side effects, but currently it seems to be working for me.
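For illustration, here is a simplified sketch of what that change would look like. It mirrors the rough shape of get_verbose in langchain_core.globals but omits its deprecation handling, and the _verbose default here is a stand-in for the module-level global:

```python
_verbose: bool = False  # stand-in for the module-level global in langchain_core.globals


def get_verbose() -> bool:
    """Return the global verbose flag, tolerating a shadowed `langchain` module."""
    try:
        import langchain  # may resolve to an unrelated local langchain.py

        old_verbose = langchain.verbose
    except (ImportError, AttributeError):
        # ImportError: langchain isn't installed at all.
        # AttributeError: `import langchain` found a module without the
        # `verbose` attribute, i.e. the shadowing case from this thread.
        old_verbose = None
    return old_verbose if old_verbose is not None else _verbose
```

With this shape, a shadowed or missing langchain module degrades to the langchain_core-local default instead of crashing.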
Thank you so much!
I have my own langchain
folder in my working dir too and Python was confused!
Renaming langchain.py to main.py fixed it!
This method works for me
System Info
274 Mac M2
this error appears often and unexpectedly but gets solved temporarily by running a force reinstall
pip install --upgrade --force-reinstall --no-deps --no-cache-dir langchain
full error
main libraries in the project: requests, streamlit, pandas, colorlog, python-dotenv, tqdm, fastapi, uvicorn, langchain, openai, tiktoken, chromadb, pypdf, logger, docx2txt
Who can help?
@hwchase17
Reproduction
intermittent - no pattern
Expected behavior
resolution