Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught ModuleNotFoundError in caching.py line 22 #2020

Closed: mcoolidge closed this issue 2 weeks ago

mcoolidge commented 2 weeks ago

Aider version: 0.59.1
Python version: 3.10.10
Platform: macOS-15.0.1-arm64-arm-64bit
Python implementation: CPython
Virtual environment: No
OS: Darwin 24.0.0 (64bit)
Git version: git version 2.43.0

An uncaught exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1129, in send_message
    yield from self.send(messages, functions=self.functions)
  File "base_coder.py", line 1414, in send
    hash_object, completion = send_completion(
  File "sendchat.py", line 83, in send_completion
    res = litellm.completion(**kwargs)
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
  File "__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 22, in <module>
    from openai._models import BaseModel as OpenAIObject
ModuleNotFoundError: No module named 'openai._models'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
  File "main.py", line 757, in main
    coder.run()
  File "base_coder.py", line 730, in run
    self.run_one(user_message, preproc)
  File "base_coder.py", line 773, in run_one
    list(self.send_message(message))
  File "base_coder.py", line 1131, in send_message
    except retry_exceptions() as err:
  File "sendchat.py", line 24, in retry_exceptions
    litellm.exceptions.APIConnectionError,
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
  File "__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 22, in <module>
    from openai._models import BaseModel as OpenAIObject
ModuleNotFoundError: No module named 'openai._models'
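The failing line imports `BaseModel` from `openai._models`, a private module that only exists in sufficiently recent openai releases, so a stale or conflicting openai install produces exactly this error. A quick way to check is a sketch like the following (assumes `python3` and pip are available; not part of the original report):

```shell
# Show which openai and litellm versions are installed, if any;
# "|| true" keeps the check going even when a package is missing.
python3 -m pip show openai litellm || true

# Reproduce the exact import that fails in litellm's caching.py.
python3 -c "from openai._models import BaseModel" \
  && echo "openai import OK" \
  || echo "openai is missing or too old for this litellm"
```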
fry69 commented 2 weeks ago

Thank you for filing this issue.

Could you please try installing (or reinstalling) aider in a separate Python environment, either with venv or pipx?

Please remove the existing environment/aider installation first, e.g. with pipx:

$ pipx uninstall aider-chat
uninstalled aider-chat! ✨ 🌟 ✨
$ pipx install aider-chat
  installed package aider-chat 0.59.1, installed using Python 3.12.7
  These apps are now globally available
    - aider
done! ✨ 🌟 ✨

This document may also be helpful: https://aider.chat/docs/troubleshooting/imports.html
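For completeness, the venv route mentioned above might look roughly like this (a sketch; the environment path `~/.venvs/aider` is arbitrary):

```shell
# Create a fresh, isolated environment (the path is arbitrary).
python3 -m venv ~/.venvs/aider

# Install aider with that environment's own pip, so no Homebrew or
# system site-packages can leak in.
~/.venvs/aider/bin/python -m pip install aider-chat

# Run aider via the environment's own entry point.
~/.venvs/aider/bin/aider --version
```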

mcoolidge commented 2 weeks ago

It appears there were conflicting versions installed, likely via both Homebrew and pipx. Removing all of them and reinstalling with pipx worked as expected.
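For anyone hitting the same thing: duplicate installs like this can usually be located before uninstalling, for example with a sketch like this (assumes a POSIX shell; not part of the original comment):

```shell
# List every aider executable on PATH; more than one hit usually
# means conflicting installations (e.g. Homebrew and pipx).
which -a aider || true

# The shebang of the first hit shows which Python interpreter it uses.
head -n 1 "$(command -v aider)" 2>/dev/null || true
```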

Thanks for the help and, more importantly, for the amazing tool :)