Open aosama opened 2 months ago
@IANTHEREAL could you take a look at this?
I'm not sure if I've understood this correctly. You mean we should initialize self.system
using the instructions populated from the existing OpenAI agent?
And did this bug cause any unexpected behavior for you? Please tell me more; it will help me understand it @aosama
Hello @IANTHEREAL,
Here is how to reproduce
My use case was:
- Using the OpenAI console, define a RAG assistant and provide it with some files to search through the storage unit.
- Use that RAG assistant through autogen as a knowledge agent.

Autogen doesn't retrieve the pre-defined storage unit and file search function for a pre-existing agent through its ID from OpenAI.com.
@ekzhu if other guys can help to handle this issue, I have been pretty busy lately
I am still a bit confused. Maybe you can review the code below and tell me more about what you expected.
import logging
from autogen import UserProxyAgent, config_list_from_json
from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent
logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)
config_list = config_list_from_json("OAI_CONFIG_LIST", file_location="notebook")
llm_config = {"config_list": config_list}
gpt_assistant = GPTAssistantAgent(
name="assistant",
llm_config=llm_config,
assistant_config={"assistant_id": assistant.id}  # `assistant` is created elsewhere; not defined in this snippet
)
user_proxy = UserProxyAgent(
name="user_proxy",
is_termination_msg=lambda msg: "TERMINATE" in msg["content"],
human_input_mode="NEVER",
max_consecutive_auto_reply=1,
code_execution_config=False,
)
user_proxy.initiate_chat(gpt_assistant, message="When will we get to san francisco according to our traveling plan?")
The output shows it works well on my traveling plan.
Then let's print the instructions, tools, and tool resources:
print("system message", gpt_assistant.system_message)
print("instructions", gpt_assistant.openai_assistant.instructions)
print("tools", gpt_assistant.openai_assistant.tools)
print("tool_resources", gpt_assistant.openai_assistant.tool_resources)
The output shows these properties are set correctly
system message You are an expert Traveling Plan Assistant. Use you knowledge base to answer questions
instructions You are an expert Traveling Plan Assistant. Use you knowledge base to answer questions
tools [FileSearchTool(type='file_search')]
tool_resources ToolResources(code_interpreter=None, file_search=ToolResourcesFileSearch(vector_store_ids=['vs_h3kWOUJEY5NpiI7AcOs2Nh9k']))
@aosama can you take a look at IAN's comment?
I will re-execute the same scenarios again with OPENAI and will report back here
Describe the bug
When providing an assistant ID to GPTAssistantAgent, the code path at line 117 always has a None value for the variables "instructions" and "specified_tools". This is because the data within self._openai_assistant is never copied into the local variables "instructions" and "specified_tools".
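In other words, the missing step is a fallback to the retrieved assistant's own fields when none are supplied locally. A minimal sketch of that fallback logic, not the actual autogen code; `SimpleNamespace` stands in for the assistant object retrieved from the OpenAI API:

```python
from types import SimpleNamespace


def resolve_assistant_config(openai_assistant, instructions=None, specified_tools=None):
    """Fall back to the fields of the already-retrieved assistant when the
    caller did not supply instructions/tools -- the copy step the report
    says never happens."""
    if instructions is None:
        instructions = openai_assistant.instructions
    if not specified_tools:
        specified_tools = openai_assistant.tools
    return instructions, specified_tools


# Stand-in for the object retrieved from the OpenAI API by assistant_id
existing = SimpleNamespace(
    instructions="You are a knowledge base for the autogen documentation.",
    tools=[{"type": "file_search"}],
)

instructions, tools = resolve_assistant_config(existing)
print(instructions)  # the server-side instructions, no longer None
print(tools)         # [{'type': 'file_search'}]
```

Locally supplied values still win: passing `instructions="..."` explicitly would skip the fallback.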
Steps to reproduce
To reproduce, using 0.2.27 .... assuming an agent set up as a knowledge base for the autogen documentation:
import os

from autogen import UserProxyAgent
from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent

llm_config = {"model": "gpt-4-turbo", "api_key": os.environ["OPENAI_API_KEY"]}

gpt_assistant = GPTAssistantAgent(
    name="assistant",
    llm_config=llm_config,
    assistant_config={"assistant_id": "XXXXXXXXX"},
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    code_execution_config={"work_dir": "coding"},
    human_input_mode="TERMINATE",  # adjust as needed for your use case
)

user_proxy.initiate_chat(gpt_assistant, message="tell me 2 things about autogen?", max_turns=1)
https://github.com/microsoft/autogen/blob/main/autogen/agentchat/contrib/gpt_assistant_agent.py
Put a breakpoint at line 121 and observe the None value.
Model Used
gpt-4-turbo .. but it doesn't matter
Expected Behavior
The code should have loaded the existing agent from OpenAI with its tools and instructions.
Screenshots and logs
No response
Additional Information
No response