paul-gauthier / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0
19.11k stars 1.76k forks

o1-mini via OpenRouter and Uncaught AttributeError in sendchat.py line 30 #1637

Closed transcendr closed 2 days ago

transcendr commented 2 days ago

Just updated to 0.56.0; now when trying to start aider with o1-mini via OpenRouter I get this message:

aider --model openrouter/openai/o1-mini
Warning for openrouter/openai/gpt-4o-mini: Unknown context window size and costs, using sane defaults. Did you mean one of these?

and when it tries to edit a file, I get the following error. Not sure why it's referencing anthropic; I'm not using Claude.

Aider version: 0.56.0
Python version: 3.11.5
Platform: macOS-14.5-arm64-arm-64bit
Python implementation: CPython
Virtual environment: No
OS: Darwin 23.5.0 (64bit)
Git version: git version 2.39.2

An uncaught exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1124, in send_message
    continue
  File "base_coder.py", line 1413, in send
    format_content("ASSISTANT", self.partial_response_content),
    ^^^^^^^^^
  File "base_coder.py", line 1408, in send
    self.keyboard_interrupt()
        ^^^^^^^^^^^^^^^^^^^^^^
  File "base_coder.py", line 1482, in show_send_output_stream
    func = chunk.choices[0].delta.function_call
  File "utils.py", line 10022, in __next__
    chunk = next(self.completion_stream)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "_streaming.py", line 43, in __next__
    return self._iterator.__next__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "_streaming.py", line 58, in __stream__
    for sse in iterator:
  File "_streaming.py", line 50, in _iter_events
    yield from self._decoder.iter_bytes(self.response.iter_bytes())
  File "_streaming.py", line 280, in iter_bytes
    for chunk in self._iter_chunks(iterator):
  File "_streaming.py", line 291, in _iter_chunks
    for chunk in iterator:
  File "_models.py", line 831, in iter_bytes
    for raw_bytes in self.iter_raw():
  File "_models.py", line 885, in iter_raw
    for raw_stream_bytes in self.stream:
  File "_client.py", line 127, in __iter__
    for chunk in self._stream:
  File "default.py", line 116, in __iter__
    for part in self._httpcore_stream:
  File "connection_pool.py", line 367, in __iter__
    raise exc from None
  File "connection_pool.py", line 363, in __iter__
    for part in self._stream:
  File "http11.py", line 349, in __iter__
    raise exc
  File "http11.py", line 341, in __iter__
    for chunk in self._connection._receive_response_body(**kwargs):
  File "http11.py", line 210, in _receive_response_body
    event = self._receive_event(timeout=timeout)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "http11.py", line 224, in _receive_event
    data = self._network_stream.read(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "sync.py", line 126, in read
    return self._sock.recv(max_bytes)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "ssl.py", line 1296, in recv
    return self.read(buflen)
           ^^^^^^^^^^^^^^^^^
  File "ssl.py", line 1169, in read
    return self._sslobj.read(len)
           ^^^^^^^^^^^^^^^^^^^^^^
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 698, in main
    return 1
  File "base_coder.py", line 735, in run
    return self.io.get_input(
                ^^^^^^^^^^^^^^
  File "base_coder.py", line 778, in run_one
    def check_for_urls(self, inp):
            ^^^^^^^^^^^^^^^^^^^^^^^
  File "base_coder.py", line 1126, in send_message
    interrupted = True
       ^^^^^^^^^^^^^^^^
  File "sendchat.py", line 30, in retry_exceptions
    litellm.llms.anthropic.chat.AnthropicError,
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'litellm.llms.anthropic' has no attribute 'AnthropicError'
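The crash happens because sendchat.py referenced `litellm.llms.anthropic.AnthropicError` as a dotted attribute, and the installed litellm no longer exposes it there. A defensive pattern for this situation is to resolve exception classes by dotted path with `getattr` and skip the ones that don't exist. The sketch below is illustrative, not aider's actual fix; `resolve_exceptions` and the stand-in module structure are hypothetical names for demonstration.

```python
# Hedged sketch (not aider's actual code): resolve exception classes by dotted
# path, skipping attributes that a given library version no longer provides,
# instead of raising AttributeError when the tuple is built.
from types import SimpleNamespace


def resolve_exceptions(root, dotted_paths):
    """Return a tuple of the exception classes that actually exist under root."""
    found = []
    for path in dotted_paths:
        obj = root
        for part in path.split("."):
            obj = getattr(obj, part, None)  # None instead of AttributeError
            if obj is None:
                break
        if isinstance(obj, type) and issubclass(obj, BaseException):
            found.append(obj)
    return tuple(found)


# Stand-in for a litellm where llms.anthropic.AnthropicError was removed:
fake_litellm = SimpleNamespace(
    llms=SimpleNamespace(anthropic=SimpleNamespace()),  # no AnthropicError
    exceptions=SimpleNamespace(APIConnectionError=ConnectionError),
)

retry_excs = resolve_exceptions(
    fake_litellm,
    ["llms.anthropic.AnthropicError", "exceptions.APIConnectionError"],
)
# Only the class that actually exists survives; the missing one is dropped.
```

The resulting tuple can then be passed to `except retry_excs:` safely, whatever the installed library version exposes.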
fry69 commented 2 days ago

Thank you for filing this issue.

The fix is in the latest version (0.57.1), please update aider with

aider --upgrade

After updating, you can suppress those warnings when using OpenRouter by starting aider with the additional command line argument --no-show-model-warnings

transcendr commented 2 days ago

ok thanks! Confirmed working in 0.57.1. I wasn't aware of that version because the auto-update said the latest version was 0.56.0

fry69 commented 2 days ago

The version checker caches the result for one day, and the new version is not old enough yet :)
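The caching behavior described above can be sketched as a simple time-stamped cache file. This is a minimal illustration, not aider's actual implementation; the function name, cache format, and `fetch` callback are assumptions for demonstration.

```python
# Hedged sketch of a one-day version-check cache (aider's real logic may
# differ): the remote version is fetched at most once per 24 hours, so a
# release published inside that window is not seen until the cache expires.
import json
import os
import tempfile
import time

CACHE_TTL = 24 * 60 * 60  # one day, in seconds


def latest_version(cache_path, fetch):
    """Return the latest known version, refetching only if the cache is stale."""
    now = time.time()
    try:
        with open(cache_path) as f:
            cached = json.load(f)
        if now - cached["checked_at"] < CACHE_TTL:
            return cached["version"]  # cache still fresh: skip the network
    except (OSError, json.JSONDecodeError, KeyError):
        pass  # no cache yet, or unreadable: fall through and refetch
    version = fetch()  # e.g. query PyPI for the newest release
    with open(cache_path, "w") as f:
        json.dump({"checked_at": now, "version": version}, f)
    return version
```

With this scheme, a check made right after 0.57.1 shipped would still report 0.56.0 until the cached entry ages out.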

paul-gauthier commented 2 days ago

I'm going to close this issue for now, but feel free to add a comment here and I will re-open. Or feel free to file a new issue any time.