Closed: tituslhy closed this issue 1 month ago
I've narrowed it down: the coroutine is the async def get_human_input method on the ChainlitUserProxyAgent itself, but I still can't fix it. When I remove the "async", the prompt "Provide feedback to chat_manager. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:" now shows up in my Chainlit panel, but when I type a reply, the last line of the get_human_input method (return reply["content"].strip()) fails because the reply has no 'content' key:
2024-07-26 16:29:29 - 'content'
Traceback (most recent call last):
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/chainlit/utils.py", line 44, in wrapper
return await user_function(**params_values)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Projects/AI-Sandbox/llm/experiments/autogen/groupchat/basicapp.py", line 62, in on_message
await cl.make_async(user_proxy.initiate_chat)( manager, message=message, )
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/asyncer/_main.py", line 358, in wrapper
return await anyio.to_thread.run_sync(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/anyio/to_thread.py", line 33, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/asyncio/futures.py", line 287, in __await__
yield self # This tells Task to wait for completion.
^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
future.result()
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/asyncio/futures.py", line 203, in result
raise self._exception.with_traceback(self._exception_tb)
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 807, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1018, in initiate_chat
self.send(msg2send, recipient, silent=silent)
File "/App/tlim2/Projects/AI-Sandbox/llm/experiments/autogen/groupchat/utils.py", line 107, in send
super(ChainlitUserProxyAgent, self).send(
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 655, in send
recipient.receive(message, self, request_reply, silent)
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 818, in receive
reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/groupchat.py", line 1052, in run_chat
reply = speaker.generate_reply(sender=self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1972, in generate_reply
final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Anaconda3/envs/llamaindex/lib/python3.12/site-packages/autogen/agentchat/conversable_agent.py", line 1724, in check_termination_and_human_reply
reply = self.get_human_input(
^^^^^^^^^^^^^^^^^^^^^
File "/App/tlim2/Projects/AI-Sandbox/llm/experiments/autogen/groupchat/utils.py", line 92, in get_human_input
return reply["content"].strip()
~~~~~^^^^^^^^^^^
KeyError: 'content'
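For reference, here is a sketch of a synchronous get_human_input that would work around both problems. It assumes Chainlit 1.x, where AskUserMessage.send() resolves to a dict keyed by "output" (earlier releases used "content") or to None on timeout, and uses cl.run_sync to bridge into Chainlit's event loop from autogen's synchronous call path; this is a sketch under those assumptions, not confirmed cookbook code:

```python
# Hedged sketch, not the cookbook's exact code: a synchronous get_human_input
# that bridges into Chainlit's event loop with cl.run_sync. Assumes Chainlit
# 1.x, where AskUserMessage.send() resolves to a dict keyed by "output"
# (earlier releases used "content") or to None if the prompt times out.
import chainlit as cl
from autogen import UserProxyAgent

class ChainlitUserProxyAgent(UserProxyAgent):
    def get_human_input(self, prompt: str) -> str:
        # autogen calls this synchronously, so the method must not be async:
        # an async def would hand autogen a coroutine object as the "reply".
        reply = cl.run_sync(cl.AskUserMessage(content=prompt, timeout=60).send())
        if not reply:
            return ""  # timed out: fall back to autogen's auto-reply
        # Tolerate both dict shapes rather than hard-coding one key.
        return (reply.get("output") or reply.get("content") or "").strip()
```

Keeping the method synchronous matters because autogen's check_termination_and_human_reply (visible in the traceback above) calls it without awaiting.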
Describe the bug
When using autogen in Chainlit, the current cookbook inherits from autogen's UserProxyAgent and ConversableAgent and amends their send() methods. With the latest versions of both libraries (chainlit==1.1.306, pyautogen==0.2.32), this only works for single agent-to-agent conversations. With a group chat it fails, because the user_proxy_agent tries to send a coroutine instead of a message.
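For context, the cookbook pattern is roughly the following (a sketch from memory, not the cookbook's exact code): subclass autogen's UserProxyAgent and override send() so each outgoing message is mirrored into the Chainlit UI.

```python
# Rough sketch of the cookbook pattern (names and details are approximate):
# subclass autogen's UserProxyAgent and override send() so each outgoing
# message is also rendered in the Chainlit panel.
import chainlit as cl
from autogen import UserProxyAgent

class ChainlitUserProxyAgent(UserProxyAgent):
    def send(self, message, recipient, request_reply=None, silent=False):
        # Mirror the message into the UI. In a group chat, `message` can be
        # a dict rather than a str, so normalize before displaying.
        content = message if isinstance(message, str) else message.get("content", "")
        cl.run_sync(cl.Message(author=self.name, content=content).send())
        # Then let autogen deliver it as usual.
        super().send(message, recipient, request_reply=request_reply, silent=silent)
```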
To Reproduce
app.py
The error occurs when the ChainlitUserProxyAgent tries to send a message, because the message happens to be of type coroutine.
Traceback:
I'm not too sure why it's a coroutine when run through Chainlit, but the same code works fine in a Jupyter notebook.
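One plausible reading of the traceback (an assumption, not a confirmed diagnosis): autogen's check_termination_and_human_reply calls get_human_input synchronously, so an async def override hands back a coroutine object instead of a string, and that coroutine is what gets sent onward as the message. A minimal plain-Python illustration:

```python
# Illustration only: calling an async function without awaiting it yields a
# coroutine object, which is what a synchronous caller like autogen receives
# from an async def get_human_input override.
import asyncio

async def get_human_input(prompt: str) -> str:
    return "some user reply"

reply = get_human_input("feedback?")   # no await here, as in autogen's sync path
print(type(reply))                     # <class 'coroutine'>, not a str
reply.close()                          # avoid the "never awaited" warning
print(type(asyncio.run(get_human_input("feedback?"))))  # <class 'str'>
```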