Aider-AI / aider

aider is AI pair programming in your terminal
https://aider.chat/
Apache License 2.0

Uncaught AttributeError in sendchat.py line 30 #2144

Closed — introvenk closed this 3 hours ago

introvenk commented 4 hours ago

Aider version: 0.59.1
Python version: 3.9.20
Platform: Linux-6.8.0-47-generic-x86_64-with-glibc2.39
Python implementation: CPython
Virtual environment: Yes
OS: Linux 6.8.0-47-generic (64bit)
Git version: git version 2.43.0

An uncaught exception occurred:

Traceback (most recent call last):
  File "utils.py", line 7428, in chunk_creator
    response_obj = self.handle_ollama_stream(chunk)
  File "utils.py", line 6971, in handle_ollama_stream
    raise e
  File "utils.py", line 6946, in handle_ollama_stream
    raise Exception(f"Ollama Error - {json_chunk}")
Exception: Ollama Error - {'error': 'an unknown error was encountered while running the model '}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "base_coder.py", line 1129, in send_message
    yield from self.send(messages, functions=self.functions)
  File "base_coder.py", line 1425, in send
    yield from self.show_send_output_stream(completion)
  File "base_coder.py", line 1492, in show_send_output_stream
    for chunk in completion:
  File "ollama.py", line 376, in ollama_completion_stream
    raise e
  File "ollama.py", line 373, in ollama_completion_stream
    for transformed_chunk in streamwrapper:
  File "utils.py", line 8012, in __next__
    raise exception_type(
  File "utils.py", line 7924, in __next__
    response: Optional[ModelResponse] = self.chunk_creator(chunk=chunk)
  File "utils.py", line 7827, in chunk_creator
    raise exception_type(
  File "exception_mapping_utils.py", line 2116, in exception_type
    raise e
  File "exception_mapping_utils.py", line 2092, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: Ollama Error - {'error': 'an unknown error was encountered while running the model '}
Traceback (most recent call last):
  File "utils.py", line 7428, in chunk_creator
    response_obj = self.handle_ollama_stream(chunk)
  File "utils.py", line 6971, in handle_ollama_stream
    raise e
  File "utils.py", line 6946, in handle_ollama_stream
    raise Exception(f"Ollama Error - {json_chunk}")
Exception: Ollama Error - {'error': 'an unknown error was encountered while running the model '}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "aider", line 8, in <module>
    sys.exit(main())
  File "main.py", line 757, in main
    coder.run()
  File "base_coder.py", line 730, in run
    self.run_one(user_message, preproc)
  File "base_coder.py", line 773, in run_one
    list(self.send_message(message))
  File "base_coder.py", line 1131, in send_message
    except retry_exceptions() as err:
  File "sendchat.py", line 30, in retry_exceptions
    litellm.llms.anthropic.chat.AnthropicError,
AttributeError: module 'litellm.llms.anthropic.chat' has no attribute 'AnthropicError'
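The AttributeError above comes from referencing an exception class that the installed litellm build no longer exposes: the attribute lookup itself fails before any retry logic can run. A minimal defensive sketch (using a hypothetical `safe_exceptions` helper, not aider's actual fix) collects only the exception classes that really exist on a module:

```python
import types

def safe_exceptions(module, *names):
    """Return a tuple of the named exception classes that actually
    exist on `module`, silently skipping any that are missing."""
    found = []
    for name in names:
        exc = getattr(module, name, None)  # None instead of AttributeError
        if isinstance(exc, type) and issubclass(exc, BaseException):
            found.append(exc)
    return tuple(found)

# Example with a stand-in module object: only the class that exists survives.
fake_mod = types.SimpleNamespace(TimeoutError=TimeoutError)
retryable = safe_exceptions(fake_mod, "TimeoutError", "AnthropicError")
```

A tuple built this way can be passed straight to an `except` clause even when some of the named classes are absent, whereas a direct dotted reference like the one on sendchat.py line 30 raises AttributeError at lookup time.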
paul-gauthier commented 3 hours ago

Thanks for trying aider and filing this issue.

This looks like a duplicate of #1414. Please see the comments there for more information, and feel free to continue the discussion within that issue.

I'm going to close this issue for now, but please let me know if you think it's actually distinct and I will reopen it.