Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught ContextWindowExceededError in exception_mapping_utils.py line 427 #2364

Status: Closed. cschubiner closed this issue 2 weeks ago.

cschubiner commented 2 weeks ago

Aider version: 0.62.1
Python version: 3.12.7
Platform: macOS-15.0-arm64-arm-64bit
Python implementation: CPython
Virtual environment: Yes
OS: Darwin 24.0.0 (64bit)
Git version: git version 2.39.5 (Apple Git-154)

An uncaught exception occurred:

Traceback (most recent call last):
  File "handler.py", line 565, in completion
    response = client.post(
               ^^^^^^^^^^^^
  File "http_handler.py", line 386, in post
    raise e
  File "http_handler.py", line 372, in post
    response.raise_for_status()
  File "_models.py", line 763, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.anthropic.com/v1/messages'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "main.py", line 1755, in completion
    response = anthropic_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "handler.py", line 580, in completion
    raise AnthropicError(
litellm.llms.anthropic.common_utils.AnthropicError: {"type":"error","error":{"type":"invalid_request_error","message":"prompt is too long: 207889 tokens > 200000 maximum"}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "main.py", line 757, in main
    coder.commands.cmd_commit()
  File "commands.py", line 252, in cmd_commit
    self.raw_cmd_commit(args)
  File "commands.py", line 266, in raw_cmd_commit
    self.coder.repo.commit(message=commit_message)
  File "repo.py", line 110, in commit
    commit_message = self.get_commit_message(diffs, context)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "repo.py", line 195, in get_commit_message
    commit_message = simple_send_with_retries(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 118, in simple_send_with_retries
    _hash, response = send_completion(**kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "sendchat.py", line 98, in send_completion
    res = litellm.completion(**kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "utils.py", line 1013, in wrapper
    raise e
  File "utils.py", line 903, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "main.py", line 2999, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "exception_mapping_utils.py", line 2116, in exception_type
    raise e
  File "exception_mapping_utils.py", line 427, in exception_type
    raise ContextWindowExceededError(
litellm.exceptions.ContextWindowExceededError: litellm.BadRequestError: litellm.ContextWindowExceededError: AnthropicError - {"type":"error","error":{"type":"invalid_request_error","message":"prompt is too long: 207889 tokens > 200000 maximum"}}
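The traceback shows the error escaping from `get_commit_message` in `repo.py`: the diff passed to `simple_send_with_retries` exceeds Claude's 200k-token context window, litellm maps the Anthropic 400 response to `ContextWindowExceededError`, and nothing along the call path catches it. A minimal sketch of one possible handling pattern, retrying once with a truncated prompt instead of crashing (this is not aider's actual fix; the function and the `max_chars` limit are illustrative, and a stand-in exception class is used here in place of the real `litellm.exceptions.ContextWindowExceededError`):

```python
class ContextWindowExceededError(Exception):
    """Stand-in for litellm.exceptions.ContextWindowExceededError."""


def get_commit_message(send, diffs, max_chars=8000):
    """Ask the model for a commit message via `send` (a callable that
    takes the prompt text and returns the completion).

    If the prompt blows past the model's context window, retry once
    with a truncated diff rather than letting the exception propagate
    uncaught, as it does in this traceback.
    """
    try:
        return send(diffs)
    except ContextWindowExceededError:
        # Illustrative fallback: a commit message derived from a partial
        # diff is better than an uncaught exception aborting the commit.
        return send(diffs[:max_chars])
```

For example, with a fake `send` that raises whenever the prompt exceeds the window, the first call fails and the truncated retry succeeds:

```python
def fake_send(prompt):
    if len(prompt) > 10:
        raise ContextWindowExceededError()
    return "chore: update"

get_commit_message(fake_send, "x" * 50, max_chars=10)  # → "chore: update"
```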
paul-gauthier commented 2 weeks ago

Thanks for trying aider and filing this issue.

This looks like a duplicate of #2240. Please see the comments there for more information, and feel free to continue the discussion within that issue.

I'm going to close this issue for now. But please let me know if you think it is actually a distinct problem and I will reopen it.