langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Upgrade to v0.0.306 - AttributeError: Could not find: llms #11306

Closed gautam-goudar closed 6 months ago

gautam-goudar commented 1 year ago

System Info

v: 0.0.306 python version: Python 3.11.4

Who can help?

No response

Information

Related Components

Reproduction

Recently upgraded langchain to version 0.0.306 (from 0.0.259).

Since then I have been running into the error below:

2023-10-02T19:42:20.279102248Z   File "/home/appuser/.local/lib/python3.11/site-packages/langchain/__init__.py", line 322, in __getattr__
2023-10-02T19:42:20.279258947Z     raise AttributeError(f"Could not find: {name}")
2023-10-02T19:42:20.279296147Z AttributeError: Could not find: llms

I have no idea what is causing this or where the call is being made from. I know the error comes from the file below (around line 328): https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/__init__.py

I am not able to find any attribute called "llms" being set anywhere in our code base, which makes the AttributeError all the more confusing.
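For background, the traceback shows that langchain's top-level __init__.py defines a module-level __getattr__ (PEP 562 lazy imports) that raises exactly this error when a requested name is not in its export table. A minimal, self-contained sketch of that pattern (the demo_pkg module and its export set are made up for illustration, not langchain's real table):

```python
import types

# Build a throwaway module object to stand in for a package's __init__.
mod = types.ModuleType("demo_pkg")

_EXPORTS = {"chains"}  # hypothetical set of lazily exported names

def _module_getattr(name):
    # Stand-in for the real lazy import; a real __init__ would call
    # importlib.import_module() here instead of returning a string.
    if name in _EXPORTS:
        return f"<lazy submodule {name}>"
    raise AttributeError(f"Could not find: {name}")

# PEP 562: a __getattr__ in a module's namespace handles missing attributes.
mod.__getattr__ = _module_getattr

print(mod.chains)  # resolved lazily via _module_getattr
try:
    mod.llms       # not in the export set -> same error shape as the issue
except AttributeError as exc:
    print(exc)     # Could not find: llms
```

So any code path (yours or a dependency's) that evaluates `langchain.llms` as an attribute, rather than importing the submodule directly, will hit this branch if the name was dropped from the lazy-export table between versions.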

Expected behavior

dosubot[bot] commented 1 year ago

🤖

Hello,

Thank you for reaching out and providing detailed information about the issue you're facing. Based on the context provided, it seems like the error "AttributeError: Could not find: llms" is being raised when LangChain is trying to access an attribute named "llms" which doesn't exist.

There are a few possible reasons for this error:

  1. There might be a typo in the LangChain framework where "llms" is used instead of a valid attribute or class name.
  2. The error might be coming from a different part of the LangChain framework not included in the context provided.
  3. If the LangChain framework is using dynamic attribute access (like getattr or setattr), the attribute name could be generated at runtime, and the error might be due to a bug in that logic.
  4. The error might be caused by an issue in the upgrade process from version 0.0.259 to 0.0.306. If the upgrade was not done correctly, it could have left the system in an inconsistent state.

I would recommend checking the entire LangChain framework for any reference to "llms", ensuring that the upgrade was done correctly, and that all dependencies are compatible with the new version of LangChain.

In a similar issue found in the LangChain repository, the solution was to update Python to 3.10, install the latest version of LangChain (not 0.0.40), and install llama-index. You can find more details about this solution here.

Unfortunately, I couldn't find specific changes made in the LangChain framework from version 0.0.259 to 0.0.306 that could have caused this error. It would be helpful if you could provide more information about the code that's causing this error, or any changes you made in your scripts when you upgraded LangChain.

I hope this helps! Please let me know if you have any other questions or need further clarification.



This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

vladmir-bc commented 9 months ago


Information

  • [ ] The official example notebooks/scripts
  • [x] My own modified scripts

Related Components

  • [x] LLMs/Chat Models
  • [ ] Embedding Models
  • [ ] Prompts / Prompt Templates / Prompt Selectors
  • [ ] Output Parsers
  • [ ] Document Loaders
  • [ ] Vector Stores / Retrievers
  • [ ] Memory
  • [ ] Agents / Agent Executors
  • [x] Tools / Toolkits
  • [ ] Chains
  • [ ] Callbacks/Tracing
  • [ ] Async


Expected behavior

  • A stack trace that shows the flow of calls
  • Documentation describing the possible causes of this error

I found that importing the following helped resolve my issue: from langchain.retrievers.document_compressors import LLMChainExtractor

It's a bit strange, since I don't explicitly use LLMChainExtractor in my code. However, after adding this import, the error raise AttributeError(f"Could not find: {name}") AttributeError: Could not find: llms stopped occurring. Perhaps the import indirectly resolves a dependency or initializes something needed for this setup. Hopefully this helps you as well.

Alternatively, you can use from langchain_core.language_models.llms import LLM instead of langchain.llms.base.LLM.
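As a sketch of that fallback, one can try the newer langchain_core path first and fall back to the legacy one (resolve_llm_base is a hypothetical helper name; it degrades gracefully to (None, None) when neither package is installed):

```python
import importlib

def resolve_llm_base():
    """Return (LLM class, import path) from whichever langchain package
    is installed, preferring the newer langchain_core location.
    Returns (None, None) if neither import succeeds."""
    candidates = (
        "langchain_core.language_models.llms",  # newer, split-out core package
        "langchain.llms.base",                  # legacy location
    )
    for path in candidates:
        try:
            module = importlib.import_module(path)
            return getattr(module, "LLM"), path
        except (ImportError, AttributeError):
            continue
    return None, None

llm_cls, llm_path = resolve_llm_base()
```

Importing the submodule explicitly like this also sidesteps the lazy attribute lookup on the langchain package that raises the "Could not find: llms" error.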