langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com

OpenLLM can't connect to local server with error: "module 'openllm' has no attribute 'client'" #25772

Open SmolPandaDev opened 2 weeks ago

SmolPandaDev commented 2 weeks ago


Example Code

Code can be found here: https://github.com/SmolPandaDev/openllm-issue
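For reference, a minimal script matching the traceback below (the linked repo is the authoritative reproduction; the server startup command and model are assumptions):

# main.py -- reconstructed from the traceback below
from langchain_community.llms import OpenLLM

# Assumes an OpenLLM server is already running locally, e.g.:
#   openllm start facebook/opt-1.3b --port 3000
llm = OpenLLM(server_url='http://localhost:3000')
print(llm.invoke('What is the capital of France?'))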

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Users/work/learning/openllm-issue/main.py", line 3, in <module>
    llm = OpenLLM(server_url='http://localhost:3000')
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/langchain_community/llms/openllm.py", line 149, in __init__
    openllm.client.HTTPClient
    ^^^^^^^^^^^^^^
AttributeError: module 'openllm' has no attribute 'client'
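The failing line is langchain_community's wrapper resolving openllm.client.HTTPClient at construction time. A quick check to confirm the installed openllm no longer exposes that attribute (a hedged sketch; recent openllm releases reworked the package into a server/CLI and appear to have dropped the old top-level client module, though the exact version where it disappeared is worth verifying):

import importlib.metadata
import openllm

print(importlib.metadata.version('openllm'))
print(hasattr(openllm, 'client'))  # the langchain wrapper requires this to be True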

Description

I'm trying to get LangChain to use a local LLM so I can ask it questions and receive answers.

System Info

Output of pip3 freeze | grep langchain:

langchain==0.2.13
langchain-community==0.2.12
langchain-core==0.2.30
langchain-openai==0.1.21
langchain-text-splitters==0.2.2
langchainhub==0.1.21

Platform: macOS 14.5 (23F79)
Output of python3 --version: Python 3.11.6

bhardwaj-vipul commented 2 weeks ago

Looks related to https://github.com/langchain-ai/langchain/pull/12968, but that PR is closed. Maybe try a different version of openllm.
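Two untested sketches along those lines. First, pinning openllm to an older release that still ships openllm.client (the exact version bound is an assumption -- verify against the openllm changelog):

pip3 install 'openllm<0.6'

Alternatively, since OpenLLM servers expose an OpenAI-compatible API, langchain-openai can talk to the server directly and bypass the broken openllm.client lookup entirely:

# Untested sketch: assumes the server exposes an OpenAI-compatible API
# under /v1. The model name is a placeholder -- use whatever the server
# is actually running.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url='http://localhost:3000/v1',
    api_key='na',            # local servers generally ignore the key
    model='<served-model>',
)
print(llm.invoke('What is the capital of France?').content)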