paul-gauthier / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught ModuleNotFoundError in caching.py line 22 #1283

Closed: aipropro closed this issue 5 days ago

aipropro commented 2 weeks ago

Aider version: 0.54.8
Python version: 3.12.4
Platform: Windows-11-10.0.22631-SP0
Python implementation: CPython
Virtual environment: No
OS: Windows 11 (64bit)
Git version: git version 2.46.0.windows.1

An uncaught exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1080, in send_message
    yield from self.send(messages, functions=self.functions)
  File "base_coder.py", line 1352, in send
    hash_object, completion = send_completion(
                              ^^^^^^^^^^^^^^^^
  File "sendchat.py", line 86, in send_completion
    res = litellm.completion(**kwargs)
          ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 22, in __getattr__
    self._load_litellm()
  File "llm.py", line 29, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 22, in <module>
    from openai._models import BaseModel as OpenAIObject
ModuleNotFoundError: No module named 'openai._models'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "__main__.py", line 7, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 680, in main
    coder.run()
  File "base_coder.py", line 728, in run
    self.run_one(user_message, preproc)
  File "base_coder.py", line 771, in run_one
    list(self.send_message(message))
  File "base_coder.py", line 1082, in send_message
    except retry_exceptions() as err:
           ^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 24, in retry_exceptions
    litellm.exceptions.APIConnectionError,
    ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 22, in __getattr__
    self._load_litellm()
  File "llm.py", line 29, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 22, in <module>
    from openai._models import BaseModel as OpenAIObject
ModuleNotFoundError: No module named 'openai._models'

paul-gauthier commented 1 week ago

Thanks for trying aider and filing this issue.

How did you install aider? It looks like you don't have the correct set of dependencies installed.
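The traceback shows litellm importing the private module `openai._models`, so the `openai` package that is installed appears not to match the version litellm expects. A minimal diagnostic sketch for checking this (the distribution names `aider-chat`, `litellm`, and `openai` are assumed to be the relevant PyPI packages):

```python
import importlib.metadata
import importlib.util

# Print the installed version of each package involved in the failed import,
# to spot a missing or mismatched dependency.
for pkg in ("aider-chat", "litellm", "openai"):
    try:
        print(pkg, importlib.metadata.version(pkg))
    except importlib.metadata.PackageNotFoundError:
        print(pkg, "not installed")

# litellm's caching module imports the private module openai._models;
# if it cannot be found, the installed openai package is too old or broken.
try:
    found = importlib.util.find_spec("openai._models") is not None
except ModuleNotFoundError:
    found = False
print("openai._models importable:", found)
```

If the reported versions disagree with aider's pinned requirements, reinstalling into a fresh virtual environment (e.g. `python -m pip install --upgrade aider-chat`) will usually pull in a consistent set of dependencies.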

paul-gauthier commented 5 days ago

I'm going to close this issue for now, but feel free to add a comment here and I will re-open it. Or file a new issue any time.