Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught ServiceUnavailableError in exception_mapping_utils.py line 507 #2409

Closed. starshipagentic closed this issue 1 week ago

starshipagentic commented 1 week ago

Aider version: 0.62.1
Python version: 3.11.10
Platform: macOS-14.7-arm64-arm-64bit
Python implementation: CPython
Virtual environment: No
OS: Darwin 23.6.0 (64bit)
Git version: git version 2.39.5 (Apple Git-154)

An uncaught exception occurred:

Traceback (most recent call last):
  File "handler.py", line 565, in completion
  File "http_handler.py", line 386, in post
    setattr(e, "message", e.response.read())
    ^^^^^^^
  File "http_handler.py", line 372, in post
    "POST", url, data=data, json=json, params=params, headers=headers  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Server error '503 Service Unavailable' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main.py", line 1755, in completion
    )

  File "handler.py", line 580, in completion
    if error_response and hasattr(error_response, "text"):
litellm.llms.anthropic.common_utils.AnthropicError: upstream connect error or disconnect/reset before headers. reset reason: remote connection failure, transport failure reason: delayed connect error: Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 827, in main
    io.tool_error(f"Message file not found: {args.message_file}")
    ^^^^^^^^^^^
  File "base_coder.py", line 738, in run
    while True:
            ^^^^
  File "base_coder.py", line 781, in run_one
  File "base_coder.py", line 1278, in send_message
    if lint_errors:

  File "base_coder.py", line 1979, in auto_commit
  File "repo.py", line 110, in commit
    commit_message = self.get_commit_message(diffs, context)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "repo.py", line 195, in get_commit_message
    commit_message = simple_send_with_retries(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 118, in simple_send_with_retries
  File "sendchat.py", line 98, in send_completion
  File "utils.py", line 1013, in wrapper
    original_function=original_function,
    ^^^^^^^
  File "utils.py", line 903, in wrapper
    result._hidden_params["response_cost"] = (
         ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^
  File "main.py", line 2999, in completion
  File "exception_mapping_utils.py", line 2116, in exception_type
    "exception_mapping_worked": exception_mapping_worked,
    ^^^^^^^
  File "exception_mapping_utils.py", line 507, in exception_type
    raise litellm.ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: litellm.ServiceUnavailableError: AnthropicException - upstream connect error or disconnect/reset before headers. reset reason: remote connection failure, transport failure reason: delayed connect error: Connection refused. Handle with `litellm.ServiceUnavailableError`.
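
For context, the final line of the traceback says the error is meant to be caught as litellm.ServiceUnavailableError. Below is a minimal sketch of a retry wrapper along those lines; it is not aider's actual sendchat.py code, and the function name, model id, retry count, and backoff values are only illustrative assumptions:

# Sketch (not aider's code): retry a litellm completion call when the
# provider returns 503 / refuses the connection, instead of crashing.
import time

import litellm


def completion_with_retries(messages, model="claude-3-5-sonnet-20241022",
                            max_retries=3, backoff=2.0):
    """Retry on ServiceUnavailableError; all parameters here are illustrative."""
    for attempt in range(max_retries):
        try:
            # Standard litellm entry point used for chat completions.
            return litellm.completion(model=model, messages=messages)
        except litellm.ServiceUnavailableError:
            # Transient upstream failure (e.g. Anthropic 503); give up only
            # after the last attempt, otherwise back off and retry.
            if attempt == max_retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))

In the traceback above, the failing request originates from simple_send_with_retries while generating a commit message, which is where handling of this kind would apply.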

paul-gauthier commented 1 week ago

Thanks for trying aider and filing this issue.

This looks like a duplicate of #2220. Please see the comments there for more information, and feel free to continue the discussion within that issue.

I'm going to close this issue for now. But please let me know if you think this is actually a distinct issue and I will reopen this issue.

Note: A bot script made these updates to the issue.