
Circular import when using GoogleSearchAPIWrapper #28432

Open hjenryin opened 2 days ago

hjenryin commented 2 days ago


Example Code

from langchain_google_community import GoogleSearchAPIWrapper

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/home/hjenryin/MALLM/pf/uiuc-informatics/google.py", line 1, in <module>
    from langchain_google_community import GoogleSearchAPIWrapper
  File "/home/hjenryin/miniconda3/envs/py12/lib/python3.12/site-packages/langchain_google_community/__init__.py", line 1, in <module>
    from langchain_google_community.bigquery import BigQueryLoader
  File "/home/hjenryin/miniconda3/envs/py12/lib/python3.12/site-packages/langchain_google_community/bigquery.py", line 8, in <module>
    from langchain_google_community._utils import get_client_info
  File "/home/hjenryin/miniconda3/envs/py12/lib/python3.12/site-packages/langchain_google_community/_utils.py", line 6, in <module>
    from google.api_core.gapic_v1.client_info import ClientInfo
  File "/home/hjenryin/MALLM/pf/uiuc-informatics/google.py", line 1, in <module>
    from langchain_google_community import GoogleSearchAPIWrapper
ImportError: cannot import name 'GoogleSearchAPIWrapper' from partially initialized module 'langchain_google_community' (most likely due to a circular import) (/home/hjenryin/miniconda3/envs/py12/lib/python3.12/site-packages/langchain_google_community/__init__.py)

Description

When I try to import GoogleSearchAPIWrapper in a newly created Python 3.12 environment, it raises an ImportError.
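
One detail from the traceback that may be relevant: the line from google.api_core.gapic_v1.client_info import ClientInfo in _utils.py ends up re-entering my own file google.py, which suggests the script name is shadowing the google namespace package. A minimal diagnostic sketch (assuming it is run from the same directory as google.py):

import importlib.util

spec = importlib.util.find_spec("google")
# The real google namespace package reports origin=None here;
# a shadowing local google.py shows up as a concrete file path instead.
print(spec.origin if spec else "google not importable")

If that prints the path to the local google.py, renaming the script should break the cycle.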

System Info

System Information

OS: Linux
OS Version: #1 SMP Fri Mar 29 23:14:13 UTC 2024
Python Version: 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 13:27:36) [GCC 11.2.0]

Package Information

langchain_core: 0.3.21
langchain: 0.3.9
langchain_community: 0.3.8
langsmith: 0.1.147
langchain_google_community: 2.0.3
langchain_text_splitters: 0.3.2

Optional packages not installed

langserve

Other Dependencies

aiohttp: 3.11.8
async-timeout: Installed. No version info available.
beautifulsoup4: Installed. No version info available.
dataclasses-json: 0.6.7
db-dtypes: Installed. No version info available.
gapic-google-longrunning: Installed. No version info available.
google-api-core: 2.23.0
google-api-python-client: 2.154.0
google-auth-httplib2: 0.2.0
google-auth-oauthlib: Installed. No version info available.
google-cloud-aiplatform: Installed. No version info available.
google-cloud-bigquery: Installed. No version info available.
google-cloud-bigquery-storage: Installed. No version info available.
google-cloud-contentwarehouse: Installed. No version info available.
google-cloud-core: 2.4.1
google-cloud-discoveryengine: Installed. No version info available.
google-cloud-documentai: Installed. No version info available.
google-cloud-documentai-toolbox: Installed. No version info available.
google-cloud-speech: Installed. No version info available.
google-cloud-storage: Installed. No version info available.
google-cloud-texttospeech: Installed. No version info available.
google-cloud-translate: Installed. No version info available.
google-cloud-vision: Installed. No version info available.
googlemaps: Installed. No version info available.
grpcio: 1.68.0
httpx: 0.28.0
httpx-sse: 0.4.0
jsonpatch: 1.33
langsmith-pyo3: Installed. No version info available.
numpy: 1.26.4
orjson: 3.10.12
packaging: 24.2
pandas: Installed. No version info available.
pyarrow: Installed. No version info available.
pydantic: 2.10.2
pydantic-settings: 2.6.1
PyYAML: 6.0.2
requests: 2.32.3
requests-toolbelt: 1.0.0
SQLAlchemy: 2.0.35
tenacity: 9.0.0
typing-extensions: 4.12.2

keenborder786 commented 2 days ago

I tried it and it worked for me. Can you make sure you have upgraded to the latest versions?
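
For reference, the upgrade would be something along these lines (PyPI distribution names assumed to match the imports above):

pip install -U langchain-google-community langchain langchain-core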

hjenryin commented 2 days ago

That's strange. Could you post your system information as well (by running python -m langchain_core.sys_info)? Thanks!