Hello, just curious whether anyone has ever had complete success with this? I was recently trying to go fully async with a chainlit/autogen implementation but ran into this issue. I tried running with v0.2.0b1 and got a bit further downstream, but then hit the error below. NOTE: the code I am currently playing with, which generated the trace below, comes from the chainlit-autogen cookbook: https://github.com/Chainlit/cookbook/tree/main/pyautogen
Traceback (most recent call last):
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "main.py", line 122, in on_chat_start
    await user_proxy.a_initiate_chat(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 577, in a_initiate_chat
    await self.a_send(self.generate_init_message(**context), recipient, silent=silent)
  File "main.py", line 94, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 403, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 523, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 41, in a_send
    await super(ChainlitAssistantAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 403, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 523, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 94, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 401, in a_send
    valid = self._append_oai_message(message, "assistant", recipient)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 295, in _append_oai_message
    message = self._message_to_dict(message)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 277, in _message_to_dict
    return dict(message)
TypeError: 'coroutine' object is not iterable
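For what it is worth, the failure mode can be reproduced in isolation: handing dict() an un-awaited coroutine raises exactly this error. In the sketch below, message_to_dict is a simplified stand-in for the _message_to_dict shown in the trace, not autogen's actual code:

async def get_human_input(prompt: str) -> str:
    return "some user feedback"  # stand-in for an async UI prompt

def message_to_dict(message):
    # Simplified stand-in for the _message_to_dict in the trace above:
    # a str gets wrapped, anything else is handed straight to dict().
    if isinstance(message, str):
        return {"content": message}
    return dict(message)

reply = get_human_input("Provide feedback: ")  # note: no await, so this is a coroutine
try:
    message_to_dict(reply)
except TypeError as e:
    print(e)  # 'coroutine' object is not iterable
finally:
    reply.close()  # suppress the "coroutine was never awaited" warning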
Could you try the latest version? v0.2.15.
Hello @sonichi, I upgraded to v0.2.15 and, again using the autogen example from the chainlit cookbook with human_input_mode="ALWAYS", I am running into an error the first time the user_proxy asks for feedback. If I simply hit return and go with auto_reply, the following message gets created:
{'role': 'user', 'content': <coroutine object ChainlitUserProxyAgent.get_human_input at 0x10b975cc0>}
That message ultimately ends up in content_str in autogen/code_utils.py and fails at line 65 because of the coroutine type:
if content is None:
    return ""
if isinstance(content, str):
    return content
if not isinstance(content, list):
    raise TypeError(f"content must be None, str, or list, but got {type(content)}")
Here is the complete stack trace:
Traceback (most recent call last):
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "main.py", line 135, in on_chat_start
    await user_proxy.a_initiate_chat(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 894, in a_initiate_chat
    await self.a_send(await self.a_generate_init_message(**context), recipient, silent=silent)
  File "main.py", line 103, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 731, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 45, in a_send
    await super(ChainlitAssistantAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 731, in a_receive
    await self.a_send(reply, sender, silent=silent)
  File "main.py", line 103, in a_send
    await super(ChainlitUserProxyAgent, self).a_send(
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 583, in a_send
    await recipient.a_receive(message, self, request_reply, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 726, in a_receive
    self._process_received_message(message, sender, silent)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 657, in _process_received_message
    self._print_received_message(message, sender)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/agentchat/conversable_agent.py", line 619, in _print_received_message
    print(content_str(content), flush=True)
  File "/Users/carey/PycharmProjects/chainlit-async/venv/lib/python3.9/site-packages/autogen/code_utils.py", line 65, in content_str
    raise TypeError(f"content must be None, str, or list, but got {type(content)}")
TypeError: content must be None, str, or list, but got <class 'coroutine'>
I am continuing to look into this, but if anything comes to mind, please let me know. Thanks!
Hello @sonichi, I finally found some time to look into this further, and the issue lies solely in the chainlit cookbook code; autogen's async support is working perfectly. I will open a PR on the chainlit side to resolve it. Thanks for the work you do on autogen, it is truly exciting to work with these capabilities!
@my3sons @constantinidan can we close this issue?
Hello @julianakiseleva , from my perspective, this issue can be closed.
Hi!
Ideally, get_human_input should also have the ability to be async. Right now, it seems impossible to write fully async code because of this. To do that, check_termination_and_human_reply would need to be made async, and the three calls to get_human_input would need to be awaited, as in the sketch below.
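For illustration, here is a minimal sketch of the shape being proposed, using the a_ naming convention visible in the traces above; the class and method bodies are illustrative assumptions, not autogen's actual code:

import asyncio

class Agent:
    async def a_get_human_input(self, prompt: str) -> str:
        # A subclass (e.g. a Chainlit integration) could await a UI prompt
        # here; this default just moves the blocking input() off the loop.
        return await asyncio.to_thread(input, prompt)

    async def a_check_termination_and_human_reply(self, prompt: str):
        reply = await self.a_get_human_input(prompt)  # awaited, so a str
        return {"role": "user", "content": reply}

async def main():
    print(await Agent().a_check_termination_and_human_reply("Feedback: "))

asyncio.run(main())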
Thanks