paul-gauthier / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Unexpected error: litellm.InternalServerError: AnthropicException - Overloaded #957

Closed: ljcremer closed this issue 1 month ago

ljcremer commented 1 month ago

Issue

Aider v0.46.0
Models: claude-3-5-sonnet-20240620 with diff edit format, weak model claude-3-haiku-20240307
Git repo: .git with 280 files
Repo-map: using 1024 tokens
Use /help for help, run "aider --help" to see cmd line args

/ask Tell me the key functionality of the code

Unexpected error: litellm.InternalServerError: AnthropicException - Overloaded. Handle with litellm.InternalServerError.
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 9950, in __next__
    chunk = next(self.completion_stream)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/anthropic.py", line 1100, in __next__
    return self.chunk_parser(chunk=data_json)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/llms/anthropic.py", line 1060, in chunk_parser
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: Overloaded

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/aider/coders/base_coder.py", line 863, in send_new_user_message
    yield from self.send(messages, functions=self.functions)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/aider/coders/base_coder.py", line 1124, in send
    yield from self.show_send_output_stream(completion)
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/aider/coders/base_coder.py", line 1198, in show_send_output_stream
    for chunk in completion:
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 10025, in __next__
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 8034, in exception_type
    raise e
  File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/site-packages/litellm/utils.py", line 6556, in exception_type
    raise litellm.InternalServerError(
litellm.exceptions.InternalServerError: litellm.InternalServerError: AnthropicException - Overloaded. Handle with litellm.InternalServerError.

Version and model info

Aider v0.46.0
Models: claude-3-5-sonnet-20240620 with diff edit format, weak model claude-3-haiku-20240307
Git repo: .git with 280 files
Repo-map: using 1024 tokens

paul-gauthier commented 1 month ago

Thanks for trying aider and filing this issue.

That error means that Anthropic's API servers are overloaded. Best thing is to try again in a bit and see if they've recovered.

ljcremer commented 1 month ago

Great, thanks

SupraSummus commented 1 month ago

I think aider should somehow handle this exception, because right now the user doesn't know what to do. I just entered the request "please retry" and the AI handled that fine. But before that I had searched the issues in this repo, so that indicates some level of confusion is present ;) Cheers. I love the new /ask feature.

paul-gauthier commented 1 month ago

Good point. I've added litellm.InternalServerError to be retried with exponential backoff.
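
For illustration, a minimal sketch of what retrying a litellm call with exponential backoff can look like. This is not aider's actual implementation; the function name, retry count, and delays are made up for the example:

import time

import litellm

def completion_with_backoff(model, messages, max_retries=5, initial_delay=0.5):
    # Retry transient provider failures, doubling the delay each time.
    delay = initial_delay
    for attempt in range(max_retries):
        try:
            return litellm.completion(model=model, messages=messages)
        except litellm.InternalServerError as err:
            if attempt == max_retries - 1:
                raise  # retries exhausted; let the caller see the error
            print(f"Provider overloaded ({err}); retrying in {delay:.1f}s")
            time.sleep(delay)
            delay *= 2

One wrinkle with streaming responses: as the traceback above shows, the exception can be raised while iterating the chunks, not when completion() returns, so a real retry loop also has to cover the code that consumes the stream.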

The change is available in the main branch. You can get it by installing the latest version from github:

python -m pip install --upgrade git+https://github.com/paul-gauthier/aider.git

If you have a chance to try it, let me know if it works better for you.

beat commented 1 month ago

I'm on the latest Docker version (0.46, iirc), and I get the same "unexpected error" with the same line numbers, and the same "During handling of the above exception, another exception occurred". Thank you @SupraSummus for the "please retry" hint, I'll try the retry next time :-)

beat commented 1 month ago

FYI, in the latest version v0.46.2-dev (from Docker Hub, pulled this afternoon; the smaller standard version for now), I still got a cutoff error like this one:

Unexpected error: litellm.APIConnectionError: Response ended prematurely
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1057, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1206, in read_chunked
    self._update_chunk_length()
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1136, in _update_chunk_length
    raise ProtocolError("Response ended prematurely") from None
urllib3.exceptions.ProtocolError: Response ended prematurely

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 9950, in __next__
    chunk = next(self.completion_stream)
  File "/usr/local/lib/python3.10/site-packages/litellm/llms/anthropic.py", line 1084, in __next__
    chunk = self.response_iterator.__next__()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 869, in iter_lines
    for chunk in self.iter_content(
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: Response ended prematurely

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1057, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1206, in read_chunked
    self._update_chunk_length()
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1136, in _update_chunk_length
    raise ProtocolError("Response ended prematurely") from None
urllib3.exceptions.ProtocolError: Response ended prematurely

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 9950, in __next__
    chunk = next(self.completion_stream)
  File "/usr/local/lib/python3.10/site-packages/litellm/llms/anthropic.py", line 1084, in __next__
    chunk = self.response_iterator.__next__()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 869, in iter_lines
    for chunk in self.iter_content(
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: Response ended prematurely

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 865, in send_new_user_message
    yield from self.send(messages, functions=self.functions)
  File "/usr/local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1126, in send
    yield from self.show_send_output_stream(completion)
  File "/usr/local/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1200, in show_send_output_stream
    for chunk in completion:
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 10025, in __next__
    raise exception_type(
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 8034, in exception_type
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 7998, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Response ended prematurely
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1057, in stream
    yield from self.read_chunked(amt, decode_content=decode_content)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1206, in read_chunked
    self._update_chunk_length()
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1136, in _update_chunk_length
    raise ProtocolError("Response ended prematurely") from None
urllib3.exceptions.ProtocolError: Response ended prematurely

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 9950, in __next__
    chunk = next(self.completion_stream)
  File "/usr/local/lib/python3.10/site-packages/litellm/llms/anthropic.py", line 1084, in __next__
    chunk = self.response_iterator.__next__()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 869, in iter_lines
    for chunk in self.iter_content(
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: Response ended prematurely

beat commented 1 month ago

Also still got (with v0.47.2 docker full) the overload double-error:

Unexpected error: litellm.InternalServerError: AnthropicException - Overloaded. Handle with `litellm.InternalServerError`.
Traceback (most recent call last):
  File "/venv/lib/python3.10/site-packages/litellm/utils.py", line 9950, in __next__
    chunk = next(self.completion_stream)
  File "/venv/lib/python3.10/site-packages/litellm/llms/anthropic.py", line 1100, in __next__
    return self.chunk_parser(chunk=data_json)
  File "/venv/lib/python3.10/site-packages/litellm/llms/anthropic.py", line 1060, in chunk_parser
    raise AnthropicError(
litellm.llms.anthropic.AnthropicError: Overloaded

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/venv/lib/python3.10/site-packages/aider/coders/base_coder.py", line 865, in send_new_user_message
    yield from self.send(messages, functions=self.functions)
  File "/venv/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1126, in send
    yield from self.show_send_output_stream(completion)
  File "/venv/lib/python3.10/site-packages/aider/coders/base_coder.py", line 1200, in show_send_output_stream
    for chunk in completion:
  File "/venv/lib/python3.10/site-packages/litellm/utils.py", line 10025, in __next__
    raise exception_type(
  File "/venv/lib/python3.10/site-packages/litellm/utils.py", line 8034, in exception_type
    raise e
  File "/venv/lib/python3.10/site-packages/litellm/utils.py", line 6556, in exception_type
    raise litellm.InternalServerError(
litellm.exceptions.InternalServerError: litellm.InternalServerError: AnthropicException - Overloaded. Handle with `litellm.InternalServerError`.

paul-gauthier commented 1 month ago

I found a bug in litellm which is causing this. I implemented a workaround and reported it to them.

https://github.com/BerriAI/litellm/issues/5000
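
The details of the workaround aren't shown in this thread, but the general shape of the problem is that the stream dies mid-iteration. A hypothetical sketch (not the actual patch) of retrying a whole streamed response when that happens:

import time

import litellm

def stream_with_retries(model, messages, max_retries=3):
    # Retry the entire streamed completion if it fails part-way through.
    delay = 1.0
    for attempt in range(max_retries):
        text = ""
        try:
            response = litellm.completion(model=model, messages=messages, stream=True)
            for chunk in response:
                text += chunk.choices[0].delta.content or ""
            return text
        except (litellm.InternalServerError, litellm.APIConnectionError) as err:
            if attempt == max_retries - 1:
                raise
            print(f"Stream failed ({err}); retrying in {delay:.0f}s")
            time.sleep(delay)
            delay *= 2

Note that retrying a partially-consumed stream restarts the response from scratch, which is the simplest safe behavior when chunks may already have been shown to the user.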

The change is available in the main branch. You can get it by installing the latest version from github:

python -m pip install --upgrade git+https://github.com/paul-gauthier/aider.git

If you have a chance to try it, let me know if it works better for you.

paul-gauthier commented 1 month ago

I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.

glinkot commented 1 month ago

I've gotten this error a few times still, unfortunately:

[screenshot of the error]

RhydianDowning commented 1 month ago

I am now getting this after the recent update (also when I go to an older version). I get that this is Anthropic, but other services are working with Sonnet, so I do not know why this one is not.

AliYous commented 1 month ago

Hi! I've just installed the latest update and I'm also facing the same error. Not sure if there is any way to fix it or if it is solely on Anthropic's side. It just started happening today; I've never seen this error before.

[screenshot of the error]

paul-gauthier commented 1 month ago

These errors are because Anthropic's API servers are overloaded/down. Aider is simply reporting that the server is broken.

Anthropic has been having serious issues since yesterday. You can directly check their server status here:

https://status.anthropic.com/

boosh commented 6 days ago

It might be worth linking to that status page in the error message @paul-gauthier
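
As a rough illustration of that suggestion, an error handler could append the status page to the message it prints. A hypothetical sketch, not aider's code; the function name and matching logic are made up for the example:

import litellm

ANTHROPIC_STATUS_URL = "https://status.anthropic.com/"

def describe_error(err):
    # Build a user-facing message; add a status-page hint for Anthropic outages.
    msg = f"Unexpected error: {err}"
    if isinstance(err, litellm.InternalServerError) and "Anthropic" in str(err):
        msg += f"\nAnthropic's servers may be overloaded; check {ANTHROPIC_STATUS_URL}"
    return msg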