Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught ModuleNotFoundError in openai.py line 15 #1531

Closed rdmayo21 closed 4 weeks ago

rdmayo21 commented 1 month ago

Aider version: 0.56.0
Python version: 3.12.5
Platform: macOS-14.6.1-arm64-arm-64bit
Python implementation: CPython
Virtual environment: Yes
OS: Darwin 23.6.0 (64bit)
Git version: git version 2.46.0

An uncaught exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1124, in send_message
    yield from self.send(messages, functions=self.functions)
  File "base_coder.py", line 1396, in send
    hash_object, completion = send_completion(
                              ^^^^^^^^^^^^^^^^
  File "sendchat.py", line 86, in send_completion
    res = litellm.completion(**kwargs)
          ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 28, in <module>
    from litellm.types.utils import all_litellm_params
  File "utils.py", line 12, in <module>
    from .llms.openai import ChatCompletionToolCallChunk, ChatCompletionUsageBlock
  File "openai.py", line 15, in <module>
    from openai._legacy_response import HttpxBinaryResponseContent
ModuleNotFoundError: No module named 'openai._legacy_response'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 698, in main
    coder.run()
  File "base_coder.py", line 735, in run
    self.run_one(user_message, preproc)
  File "base_coder.py", line 778, in run_one
    list(self.send_message(message))
  File "base_coder.py", line 1126, in send_message
    except retry_exceptions() as err:
           ^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 24, in retry_exceptions
    litellm.exceptions.APIConnectionError,
    ^^^^^^^^^^^^^^^^^^
  File "llm.py", line 23, in __getattr__
    self._load_litellm()
  File "llm.py", line 30, in _load_litellm
    self._lazy_module = importlib.import_module("litellm")
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 995, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "__init__.py", line 9, in <module>
    from litellm.caching import Cache
  File "caching.py", line 28, in <module>
    from litellm.types.utils import all_litellm_params
  File "utils.py", line 12, in <module>
    from .llms.openai import ChatCompletionToolCallChunk, ChatCompletionUsageBlock
  File "openai.py", line 15, in <module>
    from openai._legacy_response import HttpxBinaryResponseContent
ModuleNotFoundError: No module named 'openai._legacy_response'
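The traceback shows why the error surfaces where it does: aider imports litellm lazily through a module-level `__getattr__` in llm.py, so an incompatible `openai` package only blows up on first attribute access — here, even while building the tuple of retryable exceptions in sendchat.py. A minimal sketch of that pattern (illustrative names only, not aider's actual implementation):

```python
import importlib


class LazyModule:
    """Defer importing `modname` until an attribute is first accessed."""

    def __init__(self, modname):
        self._modname = modname
        self._mod = None

    def __getattr__(self, name):
        # Only called when normal lookup fails, i.e. for attributes of
        # the wrapped module. The real import happens here, so any
        # ModuleNotFoundError raised by the module's own imports escapes
        # at the point of use, not at startup.
        if self._mod is None:
            self._mod = importlib.import_module(self._modname)
        return getattr(self._mod, name)


litellm = LazyModule("litellm")
# Touching any attribute (litellm.completion, litellm.exceptions, ...)
# triggers the deferred import and can raise ModuleNotFoundError.
```

Constructing the wrapper is cheap and never fails; the cost (and any import error) is paid on first use, which is why aider starts up fine and then crashes mid-request.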
rdmayo21 commented 1 month ago

Actually, this is the same thing as before: aider pins an old version of openai in requirements.txt, presumably because it doesn't know about newer releases. Maybe it would be good to warn the user that the pinned versions might not be the latest?
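For anyone hitting this, a quick way to see whether the environment has drifted from aider's pins (standard pip commands, run inside the venv aider uses):

```shell
# Which openai version does this environment actually resolve to?
python3 -m pip show openai

# Ask pip to report any dependency conflicts it can detect,
# e.g. an openai version outside the range litellm/aider pin.
python3 -m pip check
```

If `pip check` reports a conflict on `openai`, reinstalling aider in a clean environment is usually the fix.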

fry69 commented 1 month ago

Thank you for filing this issue.

aider needs the specific dependency versions it pins. Please do not install other versions alongside it; that will make aider crash.

Please install aider separately from all other Python modules. Ideally, use pipx so it manages the aider venv for you.
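That setup looks roughly like this (assuming the PyPI package name `aider-chat`, which is what aider's install docs use):

```shell
# Install pipx itself, then let it put its bin dir on PATH.
python3 -m pip install --user pipx
python3 -m pipx ensurepath

# pipx creates an isolated venv just for aider, so its pinned
# dependencies (including openai) cannot collide with other projects.
pipx install aider-chat
```

Each pipx-managed tool lives in its own venv, so upgrading openai elsewhere can no longer break aider.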

paul-gauthier commented 4 weeks ago

This looks like a duplicate of #1451, so I'm going to close it so discussion can happen there. Please let me know if you think it's actually a distinct issue.