Chainlit / cookbook


On autogen example #37

Open antoineross opened 10 months ago

antoineross commented 10 months ago

Somehow I am getting the error "[Errno 2] No such file or directory: 'workspace/tmp_code_99073d0c5dccadd93da41584f422b19a.py'", which causes the chatbot to hang on the front end even though the run completes in the backend. Frontend hanging:

Screenshot 2023-10-31 at 2 27 09 AM

Backend completion:

Screenshot 2023-10-31 at 2 28 37 AM
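
A minimal pre-flight guard, assuming the [Errno 2] comes from autogen's code executor trying to write its generated tmp_code_*.py file into a work_dir that does not exist yet (the path in the error matches the "workspace" work_dir configured in the code below); this is only a sketch of a possible workaround, not a confirmed fix:

```python
import os

# Hypothetical workaround: create the executor's work_dir ahead of time so
# autogen has somewhere to write the generated tmp_code_*.py files.
os.makedirs("workspace", exist_ok=True)
```

If the directory already exists this is a no-op, so it is safe to run at the top of the script; whether it also resolves the front-end hang is not confirmed here.
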
antoineross commented 10 months ago

Here's the code:

```python
from typing import Dict, Optional, Union

import autogen
from autogen import Agent, AssistantAgent, UserProxyAgent, config_list_from_json
import chainlit as cl

CONTEXT = """Access the XML data from the following link: https://sapes5.sapdevcenter.com/sap/opu/odata/sap/ZPDCDS_SRV/SEPMRA_I_Product_E. Utilize the libraries 'urllib.request' and 'xml.etree.ElementTree' for sending a GET request and parsing the XML data, respectively.

The XML data uses three namespaces:

  1. Default namespace: 'http://www.w3.org/2005/Atom'
  2. 'd': 'http://schemas.microsoft.com/ado/2007/08/dataservices'
  3. 'm': 'http://schemas.microsoft.com/ado/2007/08/dataservices/metadata'

The product details are nested within 'entry' -> 'content' -> 'm:properties' tags. Inside 'm:properties', each product detail is stored in a 'd:TagName' format.

To access details of a specific product, iterate through all 'entry' tags, and for each entry, navigate to the 'd:Product' tag within 'm:properties' to check the product ID. If the product ID matches the desired ID, extract all child tags within 'm:properties' to get the product details.

For queries like 'show the products from supplier 100000076', iterate through all 'entry' tags, and for each entry, navigate to the 'd:Supplier' tag within 'm:properties' to check the supplier ID. If the supplier ID matches the desired ID, extract and accumulate the 'd:Product' tags within 'm:properties' from all matching entries to list all products from that supplier.

Ensure to always check the length of the context to avoid hitting the context limit. Do not express gratitude in responses. If "Thank you" or "You're welcome" are said in the conversation, send a final response. Your final response is just "TERMINATE", do not add other sentences."""

TASK = "Show the products from supplier 100000076."

async def ask_helper(func, **kwargs):
    res = await func(**kwargs).send()
    while not res:
        res = await func(**kwargs).send()
    return res

class ChainlitAssistantAgent(AssistantAgent):
    def send(
        self,
        message: Union[Dict, str],
        recipient: Agent,
        request_reply: Optional[bool] = None,
        silent: Optional[bool] = False,
    ) -> bool:
        cl.run_sync(
            cl.Message(
                content=f'*Sending message to "{recipient.name}"*:\n\n{message}',
                author="AssistantAgent",
            ).send()
        )
        super(ChainlitAssistantAgent, self).send(
            message=message,
            recipient=recipient,
            request_reply=request_reply,
            silent=silent,
        )

class ChainlitUserProxyAgent(UserProxyAgent):
    def get_human_input(self, prompt: str) -> str:
        if prompt.startswith(
            "Provide feedback to assistant. Press enter to skip and use auto-reply"
        ):
            res = cl.run_sync(
                ask_helper(
                    cl.AskActionMessage,
                    content="Continue or provide feedback?",
                    actions=[
                        cl.Action(
                            name="continue", value="continue", label="✅ Continue"
                        ),
                        cl.Action(
                            name="feedback",
                            value="feedback",
                            label="💬 Provide feedback",
                        ),
                    ],
                )
            )
            if res.get("value") == "continue":
                return ""

        reply = cl.run_sync(ask_helper(cl.AskUserMessage, content=prompt, timeout=60))

        return reply["content"].strip()

    def send(
        self,
        message: Union[Dict, str],
        recipient: Agent,
        request_reply: Optional[bool] = None,
        silent: Optional[bool] = False,
    ):
        cl.run_sync(
            cl.Message(
                content=f'*Sending message to "{recipient.name}"*:\n\n{message}',
                author="UserProxyAgent",
            ).send()
        )
        super(ChainlitUserProxyAgent, self).send(
            message=message,
            recipient=recipient,
            request_reply=request_reply,
            silent=silent,
        )

@cl.on_chat_start
async def on_chat_start():
    config_list = config_list_from_json(env_or_file="OAI_CONFIG_LIST")
    coding_assistant = ChainlitAssistantAgent(
        name="SAP_ABAP_Code_Planner",
        llm_config={"config_list": config_list},
        system_message="""Engineer. You follow an approved plan. You write python/shell code to solve tasks. Wrap the code in a code block that specifies the script type. The user can't modify your code. So do not suggest incomplete code which requires others to modify. Don't use a code block if it's not intended to be executed by the SAP_DATA_and_AI Engineer. Don't include multiple code blocks in one response. Do not ask others to copy and paste the result. Check the execution result returned by the SAP_DATA_and_AI Engineer. If the result indicates there is an error, fix the error and output the code again. Suggest the full code instead of partial code.""",
    )
    coding_runner = ChainlitUserProxyAgent(
        name="SAP_DATA_and_AI_Engineer",
        llm_config={"config_list": config_list},
        human_input_mode="NEVER",
        code_execution_config={
            "last_n_messages": 3,
            "work_dir": "workspace",
            "use_docker": True,
        },
        system_message="""A Coding Engineer. Use python to run code. Interact with the SAP_ABAP_Code_Planner to run code. Report the result. You are an AI model capable of executing code.""",
    )
    analysis_agent = ChainlitAssistantAgent(
        name="SAP_Analysis_Agent",
        llm_config={"config_list": config_list},
        system_message="""Analysis agent. You analyse the data outputted by SAP_DATA_and_AI_Engineer when necessary. Be concise and always summarize the data when possible. Communicate with the Query_Agent when the data is analyzed.""",
    )
    user_proxy = ChainlitUserProxyAgent(
        name="Query_Agent",
        max_consecutive_auto_reply=3,
        code_execution_config=False,
        system_message="""Manager. Administrate the agents on a plan. Communicate with the SAP_ABAP_Code_Planner to plan the code. Communicate with the SAP_Analysis_Agent when we want to analyse the data. Reply TERMINATE at the end of your sentence if the task has been solved at full satisfaction. Otherwise, reply CONTINUE, or the reason why the task is not solved yet.""",
    )
    groupchat = autogen.GroupChat(
        agents=[user_proxy, coding_assistant, coding_runner, analysis_agent],
        messages=[],
        max_round=50,
    )
    manager = autogen.GroupChatManager(groupchat=groupchat)

    await cl.Message(content=f"""Datascience Agent Team 👾
                             \n\nStarting agents on task: {TASK}...""").send()
    await cl.make_async(user_proxy.initiate_chat)(
        manager,
        message=TASK + CONTEXT,
    )
```
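
For reference, here is a standalone sketch of the parsing approach the CONTEXT prompt asks the agents to produce. The URL, the namespaces, and the entry -> content -> m:properties nesting come from the prompt itself; the function name and overall structure are illustrative assumptions, not code from this repo:

```python
import urllib.request
import xml.etree.ElementTree as ET

URL = "https://sapes5.sapdevcenter.com/sap/opu/odata/sap/ZPDCDS_SRV/SEPMRA_I_Product_E"

# Namespaces listed in the CONTEXT prompt; ElementTree needs an explicit
# prefix ("atom") for the default Atom namespace when using find/findall.
NS = {
    "atom": "http://www.w3.org/2005/Atom",
    "d": "http://schemas.microsoft.com/ado/2007/08/dataservices",
    "m": "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata",
}


def products_for_supplier(supplier_id):
    """Hypothetical helper: list d:Product values for one d:Supplier."""
    with urllib.request.urlopen(URL) as resp:
        root = ET.fromstring(resp.read())
    products = []
    # Product details are nested in entry -> content -> m:properties.
    for entry in root.findall("atom:entry", NS):
        props = entry.find("atom:content/m:properties", NS)
        if props is None:
            continue
        if props.findtext("d:Supplier", default="", namespaces=NS) == supplier_id:
            products.append(props.findtext("d:Product", default="", namespaces=NS))
    return products


print(products_for_supplier("100000076"))
```

What the agents actually generate will vary from run to run; this sketch is only meant to show what a correct answer to the TASK could look like.
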
willydouhard commented 9 months ago

Did you manage to fix the issue?