langchain-ai / langchain

πŸ¦œπŸ”— Build context-aware reasoning applications
https://python.langchain.com
MIT License
88.48k stars Β· 13.89k forks

AttributeError: module 'langchain' has no attribute 'verbose' #9854

Open urbanscribe opened 10 months ago

urbanscribe commented 10 months ago

System Info

274 Mac M2

This error appears often and unexpectedly, but is resolved temporarily by running a force reinstall:

pip install --upgrade --force-reinstall --no-deps --no-cache-dir langchain

Full error:

2023-08-28 08:16:48.197 Uncaught app exception
Traceback (most recent call last):
  File "/Users/user/Developer/newfilesystem/venv/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "/Users/user/Developer/newfilesystem/pages/chat.py", line 104, in <module>
    llm = ChatOpenAI(
  File "/Users/user/Developer/newfilesystem/venv/lib/python3.10/site-packages/langchain/load/serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 339, in pydantic.main.BaseModel.__init__
  File "pydantic/main.py", line 1066, in pydantic.main.validate_model
  File "pydantic/fields.py", line 439, in pydantic.fields.ModelField.get_default
  File "/Users/user/Developer/newfilesystem/venv/lib/python3.10/site-packages/langchain/chat_models/base.py", line 49, in _get_verbosity
    return langchain.verbose

Main libraries in the project: requests, streamlit, pandas, colorlog, python-dotenv, tqdm, fastapi, uvicorn, langchain, openai, tiktoken, chromadb, pypdf, logger, docx2txt

Who can help?

@hwchase17

Information

Related Components

Reproduction

Intermittent, with no discernible pattern.

Expected behavior

resolution

obi1kenobi commented 10 months ago

This sounds like another dependency might be overwriting your installed version of langchain with a different version.

Next time you encounter this error, before you do the force-reinstall step, could you run a few commands and report their output here?

This will help us figure out why the langchain version is being overwritten and by which other package.
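The commands themselves aren't quoted in this comment; going by the reply below (and the repeat near the end of the thread), they were likely:

```shell
# Check which langchain version pip reports inside the active virtualenv.
pip freeze | grep langchain

# Install pipdeptree, then print the reverse dependency tree to see
# which installed packages depend on (and could be pinning) langchain.
pip install pipdeptree && pipdeptree --reverse
```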

urbanscribe commented 10 months ago

For sure, here you go:

(venv) user@Mac-Studio newfilesystem % pip freeze | grep langchain
langchain==0.0.275
(venv) user@Mac-Studio newfilesystem % pip install pipdeptree && pipdeptree --reverse

Collecting pipdeptree
  Downloading pipdeptree-2.13.0-py3-none-any.whl (26 kB)
Installing collected packages: pipdeptree
Successfully installed pipdeptree-2.13.0
WARNING: You are using pip version 22.0.4; however, version 23.2.1 is available.
You should consider upgrading via the '/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/bin/python -m pip install --upgrade pip' command.
annotated-types==0.5.0
async-timeout==4.0.3
β”œβ”€β”€ aiohttp==3.8.5 [requires: async-timeout>=4.0.0a3,<5.0]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
β”‚   └── openai==0.27.9 [requires: aiohttp]
└── langchain==0.0.275 [requires: async-timeout>=4.0.0,<5.0.0]
attrs==23.1.0
β”œβ”€β”€ aiohttp==3.8.5 [requires: attrs>=17.3.0]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
β”‚   └── openai==0.27.9 [requires: aiohttp]
β”œβ”€β”€ jsonschema==4.19.0 [requires: attrs>=22.2.0]
β”‚   └── altair==5.0.1 [requires: jsonschema>=3.0]
β”‚       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
└── referencing==0.30.2 [requires: attrs>=22.2.0]
    β”œβ”€β”€ jsonschema==4.19.0 [requires: referencing>=0.28.4]
    β”‚   └── altair==5.0.1 [requires: jsonschema>=3.0]
    β”‚       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── jsonschema-specifications==2023.7.1 [requires: referencing>=0.28.0]
        └── jsonschema==4.19.0 [requires: jsonschema-specifications>=2023.03.6]
            └── altair==5.0.1 [requires: jsonschema>=3.0]
                └── streamlit==1.26.0 [requires: altair>=4.0,<6]
backoff==2.2.1
└── posthog==3.0.2 [requires: backoff>=1.10.0]
    └── chromadb==0.4.7 [requires: posthog>=2.4.0]
bcrypt==4.0.1
└── chromadb==0.4.7 [requires: bcrypt>=4.0.1]
blinker==1.6.2
└── streamlit==1.26.0 [requires: blinker>=1.0.0,<2]
cachetools==5.3.1
└── streamlit==1.26.0 [requires: cachetools>=4.0,<6]
certifi==2023.7.22
β”œβ”€β”€ pulsar-client==3.2.0 [requires: certifi]
β”‚   └── chromadb==0.4.7 [requires: pulsar-client>=3.1.0]
└── requests==2.31.0 [requires: certifi>=2017.4.17]
    β”œβ”€β”€ chromadb==0.4.7 [requires: requests>=2.28]
    β”œβ”€β”€ langchain==0.0.275 [requires: requests>=2,<3]
    β”œβ”€β”€ langsmith==0.0.27 [requires: requests>=2,<3]
    β”‚   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    β”œβ”€β”€ openai==0.27.9 [requires: requests>=2.20]
    β”œβ”€β”€ posthog==3.0.2 [requires: requests>=2.7,<3.0]
    β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    β”œβ”€β”€ streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
charset-normalizer==3.2.0
β”œβ”€β”€ aiohttp==3.8.5 [requires: charset-normalizer>=2.0,<4.0]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
β”‚   └── openai==0.27.9 [requires: aiohttp]
└── requests==2.31.0 [requires: charset-normalizer>=2,<4]
    β”œβ”€β”€ chromadb==0.4.7 [requires: requests>=2.28]
    β”œβ”€β”€ langchain==0.0.275 [requires: requests>=2,<3]
    β”œβ”€β”€ langsmith==0.0.27 [requires: requests>=2,<3]
    β”‚   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    β”œβ”€β”€ openai==0.27.9 [requires: requests>=2.20]
    β”œβ”€β”€ posthog==3.0.2 [requires: requests>=2.7,<3.0]
    β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    β”œβ”€β”€ streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
click==8.1.7
β”œβ”€β”€ streamlit==1.26.0 [requires: click>=7.0,<9]
└── uvicorn==0.23.2 [requires: click>=7.0]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
colorlog==6.7.0
docx2txt==0.8
exceptiongroup==1.1.3
└── anyio==3.7.1 [requires: exceptiongroup]
    β”œβ”€β”€ starlette==0.27.0 [requires: anyio>=3.4.0,<5]
    β”‚   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
    β”‚       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
    └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
flatbuffers==23.5.26
└── onnxruntime==1.15.1 [requires: flatbuffers]
    └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
frozenlist==1.4.0
β”œβ”€β”€ aiohttp==3.8.5 [requires: frozenlist>=1.1.1]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
β”‚   └── openai==0.27.9 [requires: aiohttp]
└── aiosignal==1.3.1 [requires: frozenlist>=1.1.0]
    └── aiohttp==3.8.5 [requires: aiosignal>=1.1.2]
        β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
greenlet==2.0.2
└── SQLAlchemy==2.0.20 [requires: greenlet!=0.4.17]
    └── langchain==0.0.275 [requires: SQLAlchemy>=1.4,<3]
h11==0.14.0
└── uvicorn==0.23.2 [requires: h11>=0.8]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
httptools==0.6.0
humanfriendly==10.0
└── coloredlogs==15.0.1 [requires: humanfriendly>=9.1]
    └── onnxruntime==1.15.1 [requires: coloredlogs]
        └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
idna==3.4
β”œβ”€β”€ anyio==3.7.1 [requires: idna>=2.8]
β”‚   β”œβ”€β”€ starlette==0.27.0 [requires: anyio>=3.4.0,<5]
β”‚   β”‚   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
β”‚   β”‚       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
β”‚   └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
β”œβ”€β”€ requests==2.31.0 [requires: idna>=2.5,<4]
β”‚   β”œβ”€β”€ chromadb==0.4.7 [requires: requests>=2.28]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: requests>=2,<3]
β”‚   β”œβ”€β”€ langsmith==0.0.27 [requires: requests>=2,<3]
β”‚   β”‚   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
β”‚   β”œβ”€β”€ openai==0.27.9 [requires: requests>=2.20]
β”‚   β”œβ”€β”€ posthog==3.0.2 [requires: requests>=2.7,<3.0]
β”‚   β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
β”‚   β”œβ”€β”€ streamlit==1.26.0 [requires: requests>=2.18,<3]
β”‚   └── tiktoken==0.4.0 [requires: requests>=2.26.0]
└── yarl==1.9.2 [requires: idna>=2.0]
    └── aiohttp==3.8.5 [requires: yarl>=1.0,<2.0]
        β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
logger==1.4
MarkupSafe==2.1.3
└── Jinja2==3.1.2 [requires: MarkupSafe>=2.0]
    β”œβ”€β”€ altair==5.0.1 [requires: Jinja2]
    β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── pydeck==0.8.0 [requires: Jinja2>=2.10.1]
        └── streamlit==1.26.0 [requires: pydeck>=0.8,<1]
mdurl==0.1.2
└── markdown-it-py==3.0.0 [requires: mdurl~=0.1]
    └── rich==13.5.2 [requires: markdown-it-py>=2.2.0]
        └── streamlit==1.26.0 [requires: rich>=10.14.0,<14]
monotonic==1.6
└── posthog==3.0.2 [requires: monotonic>=1.5]
    └── chromadb==0.4.7 [requires: posthog>=2.4.0]
mpmath==1.3.0
└── sympy==1.12 [requires: mpmath>=0.19]
    └── onnxruntime==1.15.1 [requires: sympy]
        └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
multidict==6.0.4
β”œβ”€β”€ aiohttp==3.8.5 [requires: multidict>=4.5,<7.0]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
β”‚   └── openai==0.27.9 [requires: aiohttp]
└── yarl==1.9.2 [requires: multidict>=4.0]
    └── aiohttp==3.8.5 [requires: yarl>=1.0,<2.0]
        β”œβ”€β”€ langchain==0.0.275 [requires: aiohttp>=3.8.3,<4.0.0]
        └── openai==0.27.9 [requires: aiohttp]
mypy-extensions==1.0.0
└── typing-inspect==0.9.0 [requires: mypy-extensions>=0.3.0]
    └── dataclasses-json==0.5.14 [requires: typing-inspect>=0.4.0,<1]
        └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
numpy==1.25.2
β”œβ”€β”€ altair==5.0.1 [requires: numpy]
β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
β”œβ”€β”€ chroma-hnswlib==0.7.2 [requires: numpy]
β”‚   └── chromadb==0.4.7 [requires: chroma-hnswlib==0.7.2]
β”œβ”€β”€ chromadb==0.4.7 [requires: numpy>=1.21.6]
β”œβ”€β”€ langchain==0.0.275 [requires: numpy>=1,<2]
β”œβ”€β”€ numexpr==2.8.5 [requires: numpy>=1.13.3]
β”‚   └── langchain==0.0.275 [requires: numexpr>=2.8.4,<3.0.0]
β”œβ”€β”€ onnxruntime==1.15.1 [requires: numpy>=1.21.6]
β”‚   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
β”œβ”€β”€ pandas==2.0.3 [requires: numpy>=1.20.3]
β”‚   β”œβ”€β”€ altair==5.0.1 [requires: pandas>=0.18]
β”‚   β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
β”‚   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
β”œβ”€β”€ pyarrow==13.0.0 [requires: numpy>=1.16.6]
β”‚   └── streamlit==1.26.0 [requires: pyarrow>=6.0]
β”œβ”€β”€ pydeck==0.8.0 [requires: numpy>=1.16.4]
β”‚   └── streamlit==1.26.0 [requires: pydeck>=0.8,<1]
└── streamlit==1.26.0 [requires: numpy>=1.19.3,<2]
overrides==7.4.0
└── chromadb==0.4.7 [requires: overrides>=7.3.1]
packaging==23.1
β”œβ”€β”€ marshmallow==3.20.1 [requires: packaging>=17.0]
β”‚   └── dataclasses-json==0.5.14 [requires: marshmallow>=3.18.0,<4.0.0]
β”‚       └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
β”œβ”€β”€ onnxruntime==1.15.1 [requires: packaging]
β”‚   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
└── streamlit==1.26.0 [requires: packaging>=16.8,<24]
Pillow==9.5.0
└── streamlit==1.26.0 [requires: Pillow>=7.1.0,<10]
pip==22.0.4
pipdeptree==2.13.0
protobuf==4.24.2
β”œβ”€β”€ onnxruntime==1.15.1 [requires: protobuf]
β”‚   └── chromadb==0.4.7 [requires: onnxruntime>=1.14.1]
└── streamlit==1.26.0 [requires: protobuf>=3.20,<5]
Pygments==2.16.1
└── rich==13.5.2 [requires: Pygments>=2.13.0,<3.0.0]
    └── streamlit==1.26.0 [requires: rich>=10.14.0,<14]
Pympler==1.0.1
└── streamlit==1.26.0 [requires: Pympler>=0.9,<2]
PyPika==0.48.9
└── chromadb==0.4.7 [requires: PyPika>=0.48.9]
python-dotenv==1.0.0
pytz==2023.3
└── pandas==2.0.3 [requires: pytz>=2020.1]
    β”œβ”€β”€ altair==5.0.1 [requires: pandas>=0.18]
    β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
PyYAML==6.0.1
└── langchain==0.0.275 [requires: PyYAML>=5.3]
regex==2023.8.8
└── tiktoken==0.4.0 [requires: regex>=2022.1.18]
rpds-py==0.10.0
β”œβ”€β”€ jsonschema==4.19.0 [requires: rpds-py>=0.7.1]
β”‚   └── altair==5.0.1 [requires: jsonschema>=3.0]
β”‚       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
└── referencing==0.30.2 [requires: rpds-py>=0.7.0]
    β”œβ”€β”€ jsonschema==4.19.0 [requires: referencing>=0.28.4]
    β”‚   └── altair==5.0.1 [requires: jsonschema>=3.0]
    β”‚       └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    └── jsonschema-specifications==2023.7.1 [requires: referencing>=0.28.0]
        └── jsonschema==4.19.0 [requires: jsonschema-specifications>=2023.03.6]
            └── altair==5.0.1 [requires: jsonschema>=3.0]
                └── streamlit==1.26.0 [requires: altair>=4.0,<6]
setuptools==58.1.0
six==1.16.0
β”œβ”€β”€ posthog==3.0.2 [requires: six>=1.5]
β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
└── python-dateutil==2.8.2 [requires: six>=1.5]
    β”œβ”€β”€ pandas==2.0.3 [requires: python-dateutil>=2.8.2]
    β”‚   β”œβ”€β”€ altair==5.0.1 [requires: pandas>=0.18]
    β”‚   β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
    β”‚   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
    β”œβ”€β”€ posthog==3.0.2 [requires: python-dateutil>2.1]
    β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    └── streamlit==1.26.0 [requires: python-dateutil>=2.7.3,<3]
smmap==5.0.0
└── gitdb==4.0.10 [requires: smmap>=3.0.1,<6]
    └── GitPython==3.1.32 [requires: gitdb>=4.0.1,<5]
        └── streamlit==1.26.0 [requires: GitPython>=3.0.7,<4,!=3.1.19]
sniffio==1.3.0
└── anyio==3.7.1 [requires: sniffio>=1.1]
    β”œβ”€β”€ starlette==0.27.0 [requires: anyio>=3.4.0,<5]
    β”‚   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
    β”‚       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
    └── watchfiles==0.20.0 [requires: anyio>=3.0.0]
tenacity==8.2.3
β”œβ”€β”€ langchain==0.0.275 [requires: tenacity>=8.1.0,<9.0.0]
└── streamlit==1.26.0 [requires: tenacity>=8.1.0,<9]
tokenizers==0.13.3
└── chromadb==0.4.7 [requires: tokenizers>=0.13.2]
toml==0.10.2
└── streamlit==1.26.0 [requires: toml>=0.10.1,<2]
toolz==0.12.0
└── altair==5.0.1 [requires: toolz]
    └── streamlit==1.26.0 [requires: altair>=4.0,<6]
tornado==6.3.3
└── streamlit==1.26.0 [requires: tornado>=6.0.3,<7]
tqdm==4.66.1
β”œβ”€β”€ chromadb==0.4.7 [requires: tqdm>=4.65.0]
└── openai==0.27.9 [requires: tqdm]
typing-extensions==4.7.1
β”œβ”€β”€ altair==5.0.1 [requires: typing-extensions>=4.0.1]
β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
β”œβ”€β”€ chromadb==0.4.7 [requires: typing-extensions>=4.5.0]
β”œβ”€β”€ fastapi==0.99.1 [requires: typing-extensions>=4.5.0]
β”‚   └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
β”œβ”€β”€ pydantic==1.10.12 [requires: typing-extensions>=4.2.0]
β”‚   β”œβ”€β”€ chromadb==0.4.7 [requires: pydantic>=1.9,<2.0]
β”‚   β”œβ”€β”€ fastapi==0.99.1 [requires: pydantic>=1.7.4,<2.0.0,!=1.8.1,!=1.8]
β”‚   β”‚   └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
β”‚   β”œβ”€β”€ langchain==0.0.275 [requires: pydantic>=1,<3]
β”‚   └── langsmith==0.0.27 [requires: pydantic>=1,<3]
β”‚       └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
β”œβ”€β”€ pydantic-core==2.6.3 [requires: typing-extensions>=4.6.0,!=4.7.0]
β”œβ”€β”€ pypdf==3.15.4 [requires: typing-extensions>=3.7.4.3]
β”œβ”€β”€ SQLAlchemy==2.0.20 [requires: typing-extensions>=4.2.0]
β”‚   └── langchain==0.0.275 [requires: SQLAlchemy>=1.4,<3]
β”œβ”€β”€ starlette==0.27.0 [requires: typing-extensions>=3.10.0]
β”‚   └── fastapi==0.99.1 [requires: starlette>=0.27.0,<0.28.0]
β”‚       └── chromadb==0.4.7 [requires: fastapi>=0.95.2,<0.100.0]
β”œβ”€β”€ streamlit==1.26.0 [requires: typing-extensions>=4.1.0,<5]
β”œβ”€β”€ typing-inspect==0.9.0 [requires: typing-extensions>=3.7.4]
β”‚   └── dataclasses-json==0.5.14 [requires: typing-inspect>=0.4.0,<1]
β”‚       └── langchain==0.0.275 [requires: dataclasses-json>=0.5.7,<0.6.0]
└── uvicorn==0.23.2 [requires: typing-extensions>=4.0]
    └── chromadb==0.4.7 [requires: uvicorn>=0.18.3]
tzdata==2023.3
β”œβ”€β”€ pandas==2.0.3 [requires: tzdata>=2022.1]
β”‚   β”œβ”€β”€ altair==5.0.1 [requires: pandas>=0.18]
β”‚   β”‚   └── streamlit==1.26.0 [requires: altair>=4.0,<6]
β”‚   └── streamlit==1.26.0 [requires: pandas>=1.3.0,<3]
└── pytz-deprecation-shim==0.1.0.post0 [requires: tzdata]
    └── tzlocal==4.3.1 [requires: pytz-deprecation-shim]
        └── streamlit==1.26.0 [requires: tzlocal>=1.1,<5]
urllib3==2.0.4
└── requests==2.31.0 [requires: urllib3>=1.21.1,<3]
    β”œβ”€β”€ chromadb==0.4.7 [requires: requests>=2.28]
    β”œβ”€β”€ langchain==0.0.275 [requires: requests>=2,<3]
    β”œβ”€β”€ langsmith==0.0.27 [requires: requests>=2,<3]
    β”‚   └── langchain==0.0.275 [requires: langsmith>=0.0.21,<0.1.0]
    β”œβ”€β”€ openai==0.27.9 [requires: requests>=2.20]
    β”œβ”€β”€ posthog==3.0.2 [requires: requests>=2.7,<3.0]
    β”‚   └── chromadb==0.4.7 [requires: posthog>=2.4.0]
    β”œβ”€β”€ streamlit==1.26.0 [requires: requests>=2.18,<3]
    └── tiktoken==0.4.0 [requires: requests>=2.26.0]
uvloop==0.17.0
validators==0.21.2
└── streamlit==1.26.0 [requires: validators>=0.2,<1]
websockets==11.0.3
zipp==3.16.2
β”œβ”€β”€ importlib-metadata==6.8.0 [requires: zipp>=0.5]
β”‚   └── streamlit==1.26.0 [requires: importlib-metadata>=1.4,<7]
└── importlib-resources==6.0.1 [requires: zipp>=3.1.0]
    └── chromadb==0.4.7 [requires: importlib-resources]
obi1kenobi commented 10 months ago

Very unusual! Thanks for sending that over.

Would you mind adding a few lines of code just before the crashing line in your app and posting what those lines print out when the app crashes?

Add the new code right before this line in your app:

File "/Users/user/Developer/newfilesystem/pages/chat.py", line 104, in <module>
    llm = ChatOpenAI(

Here are the lines to add:

import langchain
print("*** Running langchain version:", langchain.__version__)
print("*** langchain contents:", dir(langchain))

If your app does logging in a way such that print statement output is not captured, feel free to amend the lines to redirect their output as needed to capture it right before the app crashes with the error you posted.

urbanscribe commented 10 months ago

sure thing

here is the output

AttributeError: module 'langchain' has no attribute 'verbose'
*** Running langchain version: 0.0.275
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'cache', 'debug', 'document_loaders', 'document_transformers', 'embeddings', 'llm_cache', 'retrievers', 'vectorstores', 'verbose']

obi1kenobi commented 10 months ago

Something very very strange is happening here, and I'm sure you can see it too: the langchain contents line lists 'verbose' as one of the attributes that exists on langchain, and yet the AttributeError says module 'langchain' has no attribute 'verbose'.

Could I get you to add one more printing line next to the ones you already added?

print("*** langchain verbose value:", getattr(langchain, "verbose", "<non existent>"))

Also, could you scan your application code to make sure you never do anything like del langchain.verbose or delattr(langchain, "verbose")? It would be a strange thing to do, but we're in strange territory already so I have to ask πŸ˜…

urbanscribe commented 10 months ago

*** langchain verbose value: False

pretty consistently

KeyError: 'langchain'
*** Running langchain version: 0.0.276
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'adapters', 'agents', 'base_language', 'cache', 'chains', 'chat_loaders', 'chat_models', 'debug', 'document_loaders', 'document_transformers', 'embeddings', 'graphs', 'llm_cache', 'llms', 'memory', 'output_parsers', 'prompts', 'requests', 'retrievers', 'text_splitter', 'tools', 'vectorstores', 'verbose']
*** langchain verbose value: False

Name: pydantic
Version: 1.10.12
Summary: Data validation and settings management using python type hints
Home-page: https://github.com/pydantic/pydantic
Author: Samuel Colvin
Author-email: s@muelcolvin.com
License: MIT
Location: /Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages
Requires: typing-extensions
Required-by: chainlit, chromadb, fastapi, langchain, langsmith, openapi-schema-pydantic, openbb, prisma

(venv) alexfuchs@Mac-Studio newfilesystem % pip show langchain
Name: langchain
Version: 0.0.276
Summary: Building applications with LLMs through composability
Home-page: https://github.com/langchain-ai/langchain
Author:
Author-email:
License: MIT
Location: /Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages
Requires: aiohttp, async-timeout, dataclasses-json, langsmith, numexpr, numpy, pydantic, PyYAML, requests, SQLAlchemy, tenacity
Required-by: llama-index, openbb

urbanscribe commented 10 months ago

A new intermittent error:

/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/pages/chatNEW.py", line 248, in <module>
    main()
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/pages/chatNEW.py", line 244, in main
    chat_interface.run_chat()
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/pages/chatNEW.py", line 148, in run_chat
    response = self.process_message(user_query)
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/pages/chatNEW.py", line 132, in process_message
    res = self.agent.run(input=message)
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/langchain/chains/base.py", line 480, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/langchain/chains/base.py", line 260, in __call__
    callback_manager = CallbackManager.configure(
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/langchain/callbacks/manager.py", line 1334, in configure
    return _configure(
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/langchain/callbacks/manager.py", line 1740, in _configure
    debug = _get_debug()
  File "/Volumes/ScratchDiskNVME/Best_Developer/newfilesystem/venv/lib/python3.9/site-packages/langchain/callbacks/manager.py", line 87, in _get_debug
    return langchain.debug
AttributeError: module 'langchain' has no attribute 'debug'

urbanscribe commented 10 months ago

FYI, I pretty much have to run pip install --upgrade --force-reinstall --no-deps --no-cache-dir langchain every time I start the Streamlit app.

obi1kenobi commented 10 months ago

Is it possible that the app is running inside a different virtualenv than where you added the print statements? I think something like that is the only possible answer here because of the printout value you showed earlier:

*** langchain verbose value: False

This was obtained by reading langchain.verbose, which is the value that the crashing code says doesn't exist. But it clearly exists with value False here, so it has to be the case that the error is coming from some other process which is invoking langchain in a different (probably outdated) virtualenv.

Going back through your previous messages, it seems like there might indeed be two virtualenvs at different paths and with different versions: one on Python 3.9 and one on Python 3.10.

Could you look through your machine (and your previous messages here) and make sure to run the pip freeze and pipdeptree commands in both virtualenvs? Here are the full commands one more time, for your convenience:

pip freeze | grep langchain
pip install pipdeptree && pipdeptree --reverse
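When two virtualenvs are suspected, a quick independent check is to print which interpreter is running and which file a module was actually loaded from. This is a generic sketch (not code from the thread); `json` stands in for `langchain` so it runs anywhere, but the same call with "langchain" would reveal whether a site-packages install or a stray local file is being imported:

```python
import sys

def module_origin(name: str) -> str:
    """Return the file a module was loaded from (to detect shadowing)."""
    mod = __import__(name)
    return getattr(mod, "__file__", "<built-in or namespace package>")

# Which Python is running, and where did the module come from?
print("interpreter:", sys.executable)
print("json loaded from:", module_origin("json"))
# In this thread's case, module_origin("langchain") in each virtualenv
# would show which installation each process is really using.
```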
yuqie commented 9 months ago

@obi1kenobi Hi, I also encounter this issue: I get verbose=True, but when I initialize an agent, I get AttributeError: module 'langchain' has no attribute 'verbose'

import langchain
print("*** Running langchain version:", langchain.__version__)
print("*** langchain contents:", dir(langchain))
*** Running langchain version: 0.0.239
*** langchain contents: ['Anthropic', 'ArxivAPIWrapper', 'Banana', 'BaseCache', 'BasePromptTemplate', 'CerebriumAI', 'Cohere', 'ConversationChain', 'ElasticVectorSearch', 'FAISS', 'FewShotPromptTemplate', 'ForefrontAI', 'GoldenQueryAPIWrapper', 'GoogleSearchAPIWrapper', 'GoogleSerperAPIWrapper', 'GooseAI', 'HuggingFaceHub', 'HuggingFacePipeline', 'HuggingFaceTextGenInference', 'InMemoryDocstore', 'LLMBashChain', 'LLMChain', 'LLMCheckerChain', 'LLMMathChain', 'LlamaCpp', 'MRKLChain', 'Modal', 'OpenAI', 'Optional', 'PALChain', 'Petals', 'PipelineAI', 'PowerBIDataset', 'Prompt', 'PromptTemplate', 'QAWithSourcesChain', 'ReActChain', 'SQLDatabase', 'SQLDatabaseChain', 'SagemakerEndpoint', 'SearxSearchWrapper', 'SelfAskWithSearchChain', 'SerpAPIChain', 'SerpAPIWrapper', 'StochasticAI', 'VectorDBQA', 'VectorDBQAWithSourcesChain', 'Wikipedia', 'WikipediaAPIWrapper', 'WolframAlphaAPIWrapper', 'Writer', '__all__', '__annotations__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__version__', 'agents', 'cache', 'debug', 'llm_cache', 'verbose']

print("*** langchain verbose value:", getattr(langchain, "verbose", "<non existent>"))
*** langchain verbose value: True

I use langchain v0.0.239, and the code is as follows:

from langchain.llms import LlamaCpp
from langchain.agents import load_tools, initialize_agent, AgentType

llms = LlamaCpp(model_path="/home/7B/ggml-model-f16.gguf")
tools = load_tools(["serpapi"], llm=llms)
agent = initialize_agent(tools, llms, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)

The error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[41], line 1
----> 1 agent = initialize_agent(tools, llms, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/agents/initialize.py:57, in initialize_agent(tools, llm, agent, callback_manager, agent_path, agent_kwargs, tags, **kwargs)
     55     agent_cls = AGENT_TO_CLASS[agent]
     56     agent_kwargs = agent_kwargs or {}
---> 57     agent_obj = agent_cls.from_llm_and_tools(
     58         llm, tools, callback_manager=callback_manager, **agent_kwargs
     59     )
     60 elif agent_path is not None:
     61     agent_obj = load_agent(
     62         agent_path, llm=llm, tools=tools, callback_manager=callback_manager
     63     )

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/agents/mrkl/base.py:110, in ZeroShotAgent.from_llm_and_tools(cls, llm, tools, callback_manager, output_parser, prefix, suffix, format_instructions, input_variables, **kwargs)
    102 cls._validate_tools(tools)
    103 prompt = cls.create_prompt(
    104     tools,
    105     prefix=prefix,
   (...)
    108     input_variables=input_variables,
    109 )
--> 110 llm_chain = LLMChain(
    111     llm=llm,
    112     prompt=prompt,
    113     callback_manager=callback_manager,
    114 )
    115 tool_names = [tool.name for tool in tools]
    116 _output_parser = output_parser or cls._get_default_output_parser()

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/load/serializable.py:74, in Serializable.__init__(self, **kwargs)
     73 def __init__(self, **kwargs: Any) -> None:
---> 74     super().__init__(**kwargs)
     75     self._lc_kwargs = kwargs

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/main.py:339, in pydantic.main.BaseModel.__init__()

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/main.py:1066, in pydantic.main.validate_model()

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/pydantic/fields.py:439, in pydantic.fields.ModelField.get_default()

File ~/anaconda3/envs/langchain/lib/python3.9/site-packages/langchain/chains/base.py:30, in _get_verbosity()
     29 def _get_verbosity() -> bool:
---> 30     return langchain.verbose

AttributeError: module 'langchain' has no attribute 'verbose'

It's weird: I didn't change anything, and the code ran without error yesterday, but it fails with this error when I reload the LLM model today.

obi1kenobi commented 9 months ago

This is very strange, and I haven't been able to replicate this on my end no matter what I try 😞

Any chance you can try to construct a minimal repro that triggers the problem on a clean installation of Python or anaconda, such as inside a Docker container? There's clearly some kind of problem here, but since it's only happening on your end, I'm not able to debug it or even figure out which combination of Python, anaconda, something around the model, or langchain itself is contributing to the issue popping up.

obi1kenobi commented 9 months ago

While I still don't know what's causing this, I opened #11311 recently with some refactoring that should make a few of the possible underlying causes go away. After that PR gets merged and released, I'd love your help with testing to see if it perhaps makes this issue go away 🀞

Sorry we haven't been able to get to the bottom of this yet, and thanks for bearing with us as we try to figure out what's going on!

urbanscribe commented 9 months ago

I'm also getting module 'langchain' has no attribute 'debug'.

It happens more often when two Streamlit front ends are open on the same backend, but it also happens with a single one.

hwchase17 commented 9 months ago

Can you try putting import langchain at the top of your script? I think there's some funkiness with the way Streamlit loads modules. If you make sure langchain is fully loaded (by just putting import langchain at the top), that usually fixes it.

IMK-Stefan commented 8 months ago

I have a related issue: using langchain 0.0.311, I always get the following error: AttributeError: type object 'APIOperation' has no attribute 'from_openapi_url' (although following the path in VS Code clearly shows the code for the attribute).

After removing langchain from my Poetry virtualenv and installing langchain 0.0.232 specifically, the error disappeared. I chose this version because I'm successfully using it in one of my other projects.

aisu-programming commented 5 months ago

Same issue using langchain 0.1.4 on Windows 10 / CUDA 11.8 / Python 3.10.2. Errors with the attributes debug, verbose, and llm_cache. llm_cache isn't a bool, so I can't just set it to True or False.

aisu-programming commented 5 months ago

https://github.com/langchain-ai/langchain/issues/2079#issuecomment-1487416187

Found the comment here and found that I'm not a smart man either.πŸ˜…

sxflynn commented 4 months ago

Renaming langchain.py to main.py fixed it!

mattf commented 3 months ago

@obi1kenobi the import langchain in get_verbose and get_debug is pulling in the user's langchain module.

➜ echo 'print("this is the wrong langchain")' > langchain.py
➜ poetry run python -c 'from langchain_core.globals import get_verbose; print(get_verbose())'
this is the wrong langchain
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File ".../lib/python3.12/site-packages/langchain_core/globals/__init__.py", line 72, in get_verbose
    old_verbose = langchain.verbose
                  ^^^^^^^^^^^^^^^^^
AttributeError: module 'langchain' has no attribute 'verbose'

this will also happen for get_llm_cache.

the langchain_core/globals/__init__.py code could catch AttributeError to protect against this. it could also raise a warning to the user that they're masking the langchain module with their own.

i ran into this because i had a clone of langchain in my current working directory.

obi1kenobi commented 3 months ago

the langchain_core/globals/__init__.py code could except AttributeError to protect from this. it could also raise a warning to the user that they're masking the langchain module with their own.

I don't believe this would work. If there's another langchain module that import langchain ends up resolving to, then I think none of the langchain_core code gets executed and there's no way any addition to it could warn or prevent the situation.

It's possible I might be misunderstanding the solution you are proposing? If so, I'd love to see a PR and a test that shows how it would work.

mattf commented 3 months ago

@obi1kenobi something along the lines of https://github.com/langchain-ai/langchain/commit/8d5e110e433152b8cdc94dd0e51c5c5640165451, but w/ support for debug & llm_cache and maybe better module detection. really we're detecting a problematic user environment and trying to guide them to fix it. the warning should arguably be an informative error.

obi1kenobi commented 3 months ago

The user might be running a script like the following:

import langchain

print(langchain.verbose)

Right?

If so, the suggested code you added in langchain_core never gets hit, because langchain_core never gets imported.

The user's import langchain imports the wrong thing and everything goes astray immediately. Nothing in the "real" langchain or langchain_core packages gets a chance to execute to prevent the problem or warn about it.

Or am I still misunderstanding the situation, and perhaps the end user is running import langchain_core directly somehow?

mattf commented 3 months ago

you're right, all bets are off if a user does import langchain and gets their own module instead of langchain-ai/langchain.

however, i ran into this w/ code that was using import langchain_core and tripping over a langchain/ in my working directory.

obi1kenobi commented 3 months ago

Interesting! Will defer to @hwchase17 in that case, since I've been a bit out of the loop on things here and I'm not sure if import langchain_core is an expected pattern for users or if things are generally expected to go via import langchain instead.

laurin-eichberger commented 3 months ago

I got similar errors while trying to integrate prompt flow and langchain for a project. Precisely, I got AttributeErrors for these attributes: langchain.verbose, langchain.debug, and langchain.llm_cache

In my case, this seems to be solvable by extending the exception handling for setting the globals to also catch AttributeError.

https://github.com/langchain-ai/langchain/blob/71d0981f18dd2613261ab72d584b674c0cf180fa/libs/core/langchain_core/globals/__init__.py#L47-L77

Precisely, line 73 would instead be

except (ImportError, AttributeError):

This needs to be done for at least all 3 getter functions in that file. I haven't tested it much, and I'm unsure whether it has any side effects, but so far it's working for me.

khoadaniel commented 3 months ago

@obi1kenobi the import langchain in get_verbose and get_debug is pulling in the user's langchain module.

➜ echo 'print("this is the wrong langchain")' > langchain.py
➜ poetry run python -c 'from langchain_core.globals import get_verbose; print(get_verbose())'
this is the wrong langchain
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File ".../lib/python3.12/site-packages/langchain_core/globals/__init__.py", line 72, in get_verbose
    old_verbose = langchain.verbose
                  ^^^^^^^^^^^^^^^^^
AttributeError: module 'langchain' has no attribute 'verbose'

this will also happen for get_llm_cache.

the langchain_core/globals/__init__.py code could catch AttributeError to protect against this. it could also raise a warning to the user that they're masking the langchain module with their own.

i ran into this because i had a clone of langchain in my current working directory.

Thank you so much! I have my own langchain folder in my working dir too, and Python was confused!

SkyzcYou commented 1 month ago

Renaming langchain.py to main.py fixed it!

This method works for me