microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Bug]: `Uncaught app exception` should be caught and not crash the app completely #2113

Open clemlesne opened 5 months ago

clemlesne commented 5 months ago

Describe the bug

From time to time, the LLM call fails (a timeout or an HTTP 500 from the API, per the logs below).

The generated error is `Uncaught app exception`, and the whole app crashes.

Expected Behaviour

Exceptions should be handled: the LLM notified of the failure and the call retried.
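A minimal sketch of that expectation, retrying the async call with exponential backoff instead of letting the exception escape. All names here (`call_with_retries`, `flaky`) are illustrative, not AutoGen APIs:

```python
import asyncio

async def call_with_retries(coro_factory, max_attempts=3, base_delay=1.0):
    """Retry an async callable on exception, with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return await coro_factory()
        except Exception:  # in practice, catch TimeoutError / API errors only
            if attempt == max_attempts:
                raise  # out of retries: surface the error to the caller
            await asyncio.sleep(base_delay * 2 ** (attempt - 1))

# Example: a call that fails twice, then succeeds.
attempts = {"n": 0}

async def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise TimeoutError("simulated API timeout")
    return "ok"

result = asyncio.run(call_with_retries(flaky, max_attempts=3, base_delay=0.01))
print(result)  # "ok" after two retried failures
```

In the real app, the same wrapper could surround `agent.a_initiate_chat(...)`, so a transient provider failure no longer brings down the Streamlit script.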

Screenshots and logs

Timeout:

2024-03-21 16:09:12.635 Uncaught app exception
Traceback (most recent call last):
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 542, in _run_script
    exec(code, module.__dict__)
  File "xxx/main.py", line 415, in <module>
    asyncio.run(main())
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "xxx/main.py", line 403, in main
    notes, query = await run(semantic_input, agent)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/main.py", line 335, in run
    res = await agent.a_initiate_chat(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1054, in a_initiate_chat
    await self.a_send(msg2send, recipient, silent=silent)
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 679, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 825, in a_receive
    reply = await self.a_generate_reply(sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1937, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 696, in a_run_chat
    reply = await speaker.a_generate_reply(sender=self)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1937, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1323, in a_generate_oai_reply
    return await asyncio.get_event_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TimeoutError: OpenAI API call timed out. This could be due to congestion or too small a timeout value. The timeout can be specified by setting the 'timeout' value (in seconds) in the llm_config (if you are using agents) or the OpenAIWrapper constructor (if you are using the OpenAIWrapper directly).
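As the error message itself suggests, the timeout can be raised via the `timeout` key (in seconds) of `llm_config` when constructing an agent. A sketch, with placeholder model name and key:

```python
# Raise the per-call timeout for congested endpoints; values are placeholders.
llm_config = {
    "timeout": 120,  # seconds, per the error message above
    "config_list": [{"model": "gpt-4", "api_key": "sk-..."}],
}
```

This only makes the timeout less likely; it does not stop an eventual `TimeoutError` from crashing the app, which is the point of this issue.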

HTTP 500:

2024-03-21 17:00:57.324 Uncaught app exception
Traceback (most recent call last):
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 542, in _run_script
    exec(code, module.__dict__)
  File "xxx/main.py", line 401, in <module>
    asyncio.run(main())
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "xxx/main.py", line 389, in main
    notes, query = await run(semantic_input, agent)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/main.py", line 321, in run
    res = await agent.a_initiate_chat(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1054, in a_initiate_chat
    await self.a_send(msg2send, recipient, silent=silent)
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 679, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 825, in a_receive
    reply = await self.a_generate_reply(sender=sender)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1937, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/groupchat.py", line 696, in a_run_chat
    reply = await speaker.a_generate_reply(sender=self)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1937, in a_generate_reply
    final, reply = await reply_func(
                   ^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1323, in a_generate_oai_reply
    return await asyncio.get_event_loop().run_in_executor(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1275, in generate_oai_reply
    extracted_response = self._generate_oai_reply_from_client(
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/agentchat/conversable_agent.py", line 1294, in _generate_oai_reply_from_client
    response = llm_client.create(
               ^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/oai/client.py", line 623, in create
    response = client.create(params)
               ^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/autogen/oai/client.py", line 276, in create
    response = completions.create(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 667, in create
    return self._post(
           ^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 1208, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 897, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 935, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 935, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 1021, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^
  File "xxx/.pyenv/versions/xxx/lib/python3.11/site-packages/openai/_base_client.py", line 988, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The server had an error processing your request. Sorry about that! You can retry your request, or contact us through an Azure support request at: https://go.microsoft.com/fwlink/?linkid=2213926 if you keep seeing this error. (Please include the request ID xxx in your email.)', 'type': 'server_error', 'param': None, 'code': None}}
WaelKarkoub commented 5 months ago

If it does fail, would you recommend that it retry, or return some message like an auto-reply? Or maybe even have an LLM decide what to do (not sure if that would be reliable).
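The auto-reply option mentioned above could look something like this: catch the provider error and substitute a plain fallback message so the conversation (and the app) continues. All names are illustrative, not AutoGen APIs:

```python
import asyncio

async def reply_or_fallback(generate_reply,
                            fallback="(The model call failed; please retry.)"):
    """Return the model's reply, or a fallback message on transient errors."""
    try:
        return await generate_reply()
    except (TimeoutError, ConnectionError) as exc:
        # Surface the failure as a normal message instead of an uncaught exception.
        return f"{fallback} [error: {type(exc).__name__}]"

async def broken_call():
    raise TimeoutError("simulated API timeout")

reply = asyncio.run(reply_or_fallback(broken_call))
```

Whether the fallback text should then be fed back to the LLM (so it can decide to retry or apologize) is the open design question in this thread.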