AIAnytime / Chat-with-PDF-Chatbot

This chatbot is an interactive app that lets users chat with their PDFs. It is built on an open-source stack; no OpenAI API key is required.
MIT License

TypeError: _process_class() missing 4 required positional arguments: 'match_args', 'kw_only', 'slots', and 'weakref_slot' #15

Open vishnusureshperumbavoor opened 8 months ago

vishnusureshperumbavoor commented 8 months ago

While importing:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

the following error is raised:

File "/opt/conda/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 552, in _run_script
    exec(code, module.__dict__)
File "/workspaces/Chat-with-PDF-Chatbot/chatbot_app.py", line 5, in <module>
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
File "/opt/conda/lib/python3.11/site-packages/transformers/__init__.py", line 26, in <module>
    from . import dependency_versions_check
File "/opt/conda/lib/python3.11/site-packages/transformers/dependency_versions_check.py", line 16, in <module>
    from .utils.versions import require_version, require_version_core
File "/opt/conda/lib/python3.11/site-packages/transformers/utils/__init__.py", line 60, in <module>
    from .hub import (
File "/opt/conda/lib/python3.11/site-packages/transformers/utils/hub.py", line 32, in <module>
    from huggingface_hub import (
File "<frozen importlib._bootstrap>", line 1229, in _handle_fromlist
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/__init__.py", line 370, in __getattr__
    submod = importlib.import_module(submod_path)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/hf_api.py", line 62, in <module>
    from ._inference_endpoints import InferenceEndpoint, InferenceEndpointType
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/_inference_endpoints.py", line 7, in <module>
    from .inference._client import InferenceClient
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/inference/_client.py", line 56, in <module>
    from huggingface_hub.inference._common import (
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/inference/_common.py", line 51, in <module>
    from ._text_generation import TextGenerationStreamResponse, _parse_text_generation_error
File "/opt/conda/lib/python3.11/site-packages/huggingface_hub/inference/_text_generation.py", line 59, in <module>
    @dataclass
     ^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/pydantic/dataclasses.py", line 137, in dataclass
    return wrap(_cls)
           ^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/pydantic/dataclasses.py", line 132, in wrap
    return _process_class(cls, init, repr, eq, order, unsafe_hash, frozen, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/conda/lib/python3.11/site-packages/pydantic/dataclasses.py", line 84, in _process_class
    cls = dataclasses._process_class(_cls, init, repr, eq, order, unsafe_hash, frozen)  # type: ignore
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
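The last frame of the traceback shows the cause: pydantic's v1 dataclass shim forwards only the original seven arguments to the stdlib's private `dataclasses._process_class`, but on Python 3.10/3.11 that function grew four new required parameters (`match_args`, `kw_only`, `slots`, `weakref_slot`). A quick sketch to confirm this on your interpreter (inspecting the public `dataclasses.dataclass` decorator, whose signature mirrors the private helper):

```python
import dataclasses
import inspect
import sys

# Python 3.10 added match_args, kw_only, slots; 3.11 added weakref_slot.
# An old pydantic 1.x passes only the pre-3.10 arguments through, so the
# call fails with "missing 4 required positional arguments" on 3.11.
params = set(inspect.signature(dataclasses.dataclass).parameters)
newer = {"match_args", "kw_only", "slots", "weakref_slot"}
if sys.version_info >= (3, 11):
    print(sorted(newer & params))  # the parameters the old shim never passes
```

On Python 3.11 this prints all four names, matching the four "missing" arguments in the error.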
vishnusureshperumbavoor commented 8 months ago

Previously I downgraded pydantic with pydantic==1 because of a pydantic import error.
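That downgrade is the likely culprit: `pip install pydantic==1` resolves to pydantic 1.0, which predates Python 3.10/3.11 and therefore calls the stdlib dataclass machinery with the old signature. A hedged fix sketch (assuming nothing else in the environment pins pydantic) is to move to a 3.11-compatible release instead of 1.0:

```shell
# Sketch of a fix, not verified against this repo's full requirements.
# Late pydantic 1.10.x releases keep the v1 API but support Python 3.11:
pip install --upgrade "pydantic>=1.10,<2"

# Or, if no dependency still needs the v1 API, upgrade to pydantic v2:
pip install --upgrade pydantic
```

If the original import error that prompted the downgrade reappears, it may point at a different dependency conflict worth posting here.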