Closed — yousonnet closed this issue 7 months ago
@gagb @IANTHEREAL @jtrugman fyi
@yousonnet would you like the Assistant to terminate when it receives a TERMINATE message? I think this capability is not yet supported; the current gpt_assistant implementation does not execute a termination path. What are your thoughts? @gagb
Thanks, I got it. I implemented an `a_check_is_termination_func` for GPTAssistant on a fork. Would you take this kind of PR? The implementation doesn't use the original ConversableAgent classmethod, though.
Just found out that using the parent class's methods works.
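For readers following the thread: in the base `ConversableAgent` flow, the `is_termination_msg` predicate is consulted before an auto-reply is generated, which is the step the fork restores for `GPTAssistantAgent`. A toy stand-in (simplified classes, not the real autogen API) illustrating that gating logic:

```python
from typing import Callable, Dict, Optional

class MiniAgent:
    """Toy stand-in for a ConversableAgent-style agent (NOT the real
    autogen implementation); shows how an is_termination_msg predicate
    can gate the auto-reply loop."""

    def __init__(self, name: str,
                 is_termination_msg: Optional[Callable[[Dict], bool]] = None,
                 max_auto_replies: int = 3):
        self.name = name
        self.is_termination_msg = is_termination_msg or (lambda msg: False)
        self.max_auto_replies = max_auto_replies
        self.reply_count = 0

    def receive(self, message: Dict, sender: "MiniAgent") -> bool:
        # Check the termination predicate BEFORE auto-replying; this is
        # the check the bug report says GPTAssistantAgent skips.
        if self.is_termination_msg(message):
            return True  # conversation ends here
        if self.reply_count >= self.max_auto_replies:
            return True  # safety cap so the toy loop always stops
        self.reply_count += 1
        return sender.receive({"content": f"reply from {self.name}"}, self)

assistant = MiniAgent(
    "assistant",
    is_termination_msg=lambda m: "TERMINATE" in m.get("content", ""),
)
proxy = MiniAgent("proxy")

# The predicate fires on the incoming TERMINATE, so no reply is sent.
ended = assistant.receive({"content": "TERMINATE"}, proxy)
print(ended)  # True
```

The point is only where the check sits: if the subclass never calls the parent's termination check before replying, the predicate passed at construction time is dead code, which matches the behavior reported below.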
Describe the bug
When the proxy agent sends a TERMINATE message to an instance of GPTAssistantAgent constructed with `is_termination_msg=lambda x: True`, the conversation does not end.
Steps to reproduce
```python
from autogen import (
    AssistantAgent,
    UserProxyAgent,
    ConversableAgent,
    GroupChat,
    GroupChatManager,
    config_list_from_json,
)
from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent
from constants import openai_api_key
from clients_setup import clients
from typing import List
from time import sleep

config_list = config_list_from_json(env_or_file="config.json")

def is_termination_msg(x):
    print(x)

assistant = GPTAssistantAgent(
    name="Assistant",
    llm_config={
        "seed": 42,  # seed for caching and reproducibility
        "config_list": config_list,  # a list of OpenAI API configurations
        "assistant_id": "asst",
        "temperature": 0,  # temperature for sampling
        "request_timeout": 120,  # timeout
    },
    is_termination_msg=lambda x: True,
)

def multi_accounts_tweet(tweets: List[str]):
    counter = 0
    for (client, tweet) in zip(clients, tweets):
        ...  # loop body elided in the original report

assistant.register_function(
    function_map={"multi_accounts_tweet": multi_accounts_tweet}
)

user_proxy = UserProxyAgent(
    name="proxy_agent",
    default_auto_reply="TERMINATE",
    human_input_mode="NEVER",
    is_termination_msg=is_termination_msg,
)

user_proxy.initiate_chat(
    assistant,
    message="""please generate tweets,and press them by multi_accounts_tweet in one time""",
)
```
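Side note on the repro itself: `is_termination_msg` receives a message dict and should return `True` only for genuine termination messages; `lambda x: True` would end the chat on every message if it were honored at all. A sketch of the more typical predicate (plain Python, no autogen dependency; the message-dict shape is assumed from the report):

```python
def is_termination_msg(msg: dict) -> bool:
    """Return True when the message content ends with TERMINATE.

    Sketch of the predicate commonly passed to autogen agents; `msg`
    is assumed to be the message dict an agent receives.
    """
    content = (msg.get("content") or "").strip()
    return content.endswith("TERMINATE")

print(is_termination_msg({"content": "All done. TERMINATE"}))  # True
print(is_termination_msg({"content": "still working"}))        # False
```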
Expected Behavior
No response
Screenshots and logs
No response
Additional Information
No response