garyfeng / Auto-GPT

An experimental open-source attempt to make GPT-4 fully autonomous.

get_ada_embedding(data) should handle API connection errors gracefully #6

Open garyfeng opened 1 year ago

garyfeng commented 1 year ago

Steps to reproduce 🕹

Not sure if this can be replicated reliably -- it only happens when a call to the OpenAI API fails.

Current behavior 😯

The agent aborts, and the Docker container exits with it. The last error message looks like the following:

2023-04-17 22:40:24 The above exception was the direct cause of the following exception:
2023-04-17 22:40:24 
2023-04-17 22:40:24 Traceback (most recent call last):
2023-04-17 22:40:24   File "/app/main.py", line 423, in <module>
2023-04-17 22:40:24     memory.add(memory_to_add)
2023-04-17 22:40:24   File "/app/memory/redismem.py", line 74, in add
2023-04-17 22:40:24     vector = get_ada_embedding(data)
2023-04-17 22:40:24              ^^^^^^^^^^^^^^^^^^^^^^^
2023-04-17 22:40:24   File "/app/memory/base.py", line 13, in get_ada_embedding
2023-04-17 22:40:24     return openai.Embedding.create(input=[text], model="text-embedding-ada-002")["data"][0]["embedding"]
2023-04-17 22:40:24            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2023-04-17 22:40:24   File "/usr/local/lib/python3.11/site-packages/openai/api_resources/embedding.py", line 33, in create
2023-04-17 22:40:24     response = super().create(*args, **kwargs)
2023-04-17 22:40:24                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2023-04-17 22:40:24   File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
2023-04-17 22:40:24     response, _, api_key = requestor.request(
2023-04-17 22:40:24                            ^^^^^^^^^^^^^^^^^^
2023-04-17 22:40:24   File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 216, in request
2023-04-17 22:40:24     result = self.request_raw(
2023-04-17 22:40:24              ^^^^^^^^^^^^^^^^^
2023-04-17 22:40:24   File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 528, in request_raw
2023-04-17 22:40:24     raise error.APIConnectionError(
2023-04-17 22:40:24 openai.error.APIConnectionError: Error communicating with OpenAI: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Expected behavior 🤔

The function get_ada_embedding(data) should handle unexpected errors gracefully. In this case it could retry the request, or return something that lets the program continue; essentially, it should still reach GENERATE NEXT COMMAND.
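
One possible shape for that, sketched for the non-Azure path only and assuming the openai 0.x exception classes shown in the traceback (max_retries and backoff are illustrative parameters, not existing config options):

import time

import openai
from openai.error import APIConnectionError, APIError, RateLimitError

def get_ada_embedding(text, max_retries=5, backoff=2.0):
    """Return the ada-002 embedding for text, retrying transient API failures."""
    text = text.replace("\n", " ")
    for attempt in range(1, max_retries + 1):
        try:
            return openai.Embedding.create(
                input=[text], model="text-embedding-ada-002"
            )["data"][0]["embedding"]
        except (APIConnectionError, APIError, RateLimitError) as err:
            if attempt == max_retries:
                # Out of retries: re-raise so the caller can decide whether to
                # skip this memory entry or abort.
                raise
            wait = backoff ** attempt
            print(f"OpenAI API error ({err}); retrying in {wait:.1f}s "
                  f"(attempt {attempt}/{max_retries})")
            time.sleep(wait)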

garyfeng commented 1 year ago

Another failure due to a similar problem with the OpenAI API, but in a different function. In this case, create_chat_completion in llm_utils.py (shown below) does not have any error handling; a retry-based sketch follows the error log.

# Overly simple abstraction until we create something better
def create_chat_completion(messages, model=None, temperature=None, max_tokens=None)->str:
    """Create a chat completion using the OpenAI API"""
    if cfg.use_azure:
        response = openai.ChatCompletion.create(
            deployment_id=cfg.azure_chat_deployment_id,
            model=model,
            messages=messages,
            temperature=temperature,
            max_tokens=max_tokens
        )
    else:
        response = openai.ChatCompletion.create(
            model=model,
            messages=messages,
            temperature=temperature,
            max_tokens=max_tokens
        )

    return response.choices[0].message["content"]

The error log:

  ----------- END OF CONTEXT ----------------
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 672, in _interpret_response_line
    data = json.loads(rbody)
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/main.py", line 343, in <module>
    assistant_reply = chat.chat_with_ai(
                      ^^^^^^^^^^^^^^^^^^
  File "/app/chat.py", line 125, in chat_with_ai
    assistant_reply = create_chat_completion(
                      ^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/llm_utils.py", line 19, in create_chat_completion
    response = openai.ChatCompletion.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
                           ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 226, in request
    resp, got_stream = self._interpret_response(result, stream)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 619, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 674, in _interpret_response_line
    raise error.APIError(
openai.error.APIError: HTTP code 502 from API (<html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>cloudflare</center>
</body>
</html>
)
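
One way create_chat_completion could absorb these transient connection/502 failures is a bounded retry loop around the API call. This is only a sketch for the non-Azure branch, again assuming the openai 0.x exception classes from the traceback; num_retries is an illustrative parameter, not an existing config option:

import time

import openai
from openai.error import APIConnectionError, APIError, RateLimitError

def create_chat_completion(messages, model=None, temperature=None,
                           max_tokens=None, num_retries=5) -> str:
    """Create a chat completion, retrying on transient OpenAI API errors."""
    for attempt in range(1, num_retries + 1):
        try:
            response = openai.ChatCompletion.create(
                model=model,
                messages=messages,
                temperature=temperature,
                max_tokens=max_tokens,
            )
            return response.choices[0].message["content"]
        except (APIConnectionError, APIError, RateLimitError) as err:
            if attempt == num_retries:
                # Last attempt failed: propagate so the caller can report it
                # instead of crashing deep inside the API client.
                raise
            wait = 2 ** attempt  # simple exponential backoff
            print(f"OpenAI API error ({err}); retrying in {wait}s "
                  f"(attempt {attempt}/{num_retries})")
            time.sleep(wait)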
garyfeng commented 1 year ago

The original problem is with memory/base.py:

def get_ada_embedding(text):
    text = text.replace("\n", " ")
    if cfg.use_azure:
        return openai.Embedding.create(input=[text], engine=cfg.azure_embeddigs_deployment_id, model="text-embedding-ada-002")["data"][0]["embedding"]
    else:
        return openai.Embedding.create(input=[text], model="text-embedding-ada-002")["data"][0]["embedding"]
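
If pulling in a dependency is acceptable, the same retry behaviour can be expressed as a decorator. This sketch assumes the third-party tenacity package (an assumption; it is not necessarily an Auto-GPT dependency) and again covers only the non-Azure path:

import openai
from openai.error import APIConnectionError, APIError, RateLimitError
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential

@retry(
    # Retry only on the transient API errors seen in the logs above.
    retry=retry_if_exception_type((APIConnectionError, APIError, RateLimitError)),
    wait=wait_exponential(multiplier=1, min=1, max=30),
    stop=stop_after_attempt(5),
    reraise=True,  # after the final attempt, raise the original exception
)
def get_ada_embedding(text):
    text = text.replace("\n", " ")
    return openai.Embedding.create(
        input=[text], model="text-embedding-ada-002"
    )["data"][0]["embedding"]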
garyfeng commented 1 year ago

Yet another occurrence; the traceback is identical to the 502 Bad Gateway error posted above.
