Traceback (most recent call last):
  File "/Users/work/learning/openllm-issue/main.py", line 3, in <module>
    llm = OpenLLM(server_url='http://localhost:3000')
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/langchain_community/llms/openllm.py", line 149, in __init__
    openllm.client.HTTPClient
    ^^^^^^^^^^^^^^
AttributeError: module 'openllm' has no attribute 'client'
Description
I'm trying to get LangChain to use a local LLM served by OpenLLM so that I can ask it questions and receive answers.
Checked other resources
Example Code
Code can be found here: https://github.com/SmolPandaDev/openllm-issue
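In case the linked repo is unavailable, here is a self-contained sketch of the failure mode. It assumes (my reading of the traceback, not verified against the openllm source) that the installed openllm build no longer ships a `client` submodule, so the attribute lookup in langchain_community's wrapper fails; the fake module below reproduces the same AttributeError without needing either package installed.

```python
import types

# Simulate an `openllm` module that lacks the `client` attribute
# (assumption: this matches the installed openllm build).
fake_openllm = types.ModuleType("openllm")

try:
    fake_openllm.client.HTTPClient  # mirrors line 149 of langchain_community's openllm.py
    msg = ""
except AttributeError as exc:
    msg = str(exc)

print(msg)  # module 'openllm' has no attribute 'client'
```

This suggests the error is a version mismatch between `langchain-community` and `openllm` rather than anything in the calling code.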
Error Message and Stack Trace (if applicable)
System Info
Output of pip3 freeze | grep langchain:
Platform: macOS 14.5 (23F79)
Output of python3 --version: Python 3.11.6
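Since the `pip3 freeze | grep langchain` filter above misses the `openllm` package itself, a small diagnostic (variable names are mine) can report the installed openllm version and whether it still exposes the `client` attribute the wrapper relies on:

```python
import importlib.util

# Diagnostic: is openllm installed, and does it still expose `client`?
spec = importlib.util.find_spec("openllm")
if spec is None:
    report = "openllm is not installed"
else:
    import openllm
    version = getattr(openllm, "__version__", "unknown")
    report = f"openllm {version}, has client attr: {hasattr(openllm, 'client')}"

print(report)
```

Including this output in the report would confirm which openllm version triggers the AttributeError.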