
[Question]: How to Modify Agent Prompt and Query Engine defaults Prompts #15646

Open · WILLOSCAR opened this issue 2 months ago

WILLOSCAR commented 2 months ago


Question

After calling chat_engine = index.as_chat_engine(), I discovered that chat_engine is an OpenAIAgent object, which behaves very differently from query_engine. Do I need to modify the tool inside it? I'm looking for guidance on:

  1. Modifying the agent's default behavior prompt: How can I customize the default prompt that the agent uses during interactions?
  2. Updating the query_engine_tool settings: Specifically, how can I adjust the default text_qa_template that is used during retrieval? For example, with query_engine, we can update prompts like this:

     query_engine.update_prompts(
         {"response_synthesizer:summary_template": new_summary_tmpl}
     )
dosubot[bot] commented 2 months ago

To address your questions on modifying the agent's default behavior prompt and updating the query_engine_tool settings, follow these steps:

1. Modifying the Agent's Default Behavior Prompt

You can customize the default prompt that the OpenAIAgent uses during interactions by passing either the system_prompt or prefix_messages parameter to the from_tools method. Here are examples of both:

Using system_prompt:

from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo-0613")

# multiply_tool and add_tool are assumed to be FunctionTool
# instances defined elsewhere
agent = OpenAIAgent.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    system_prompt="Your custom system prompt here.",
)

Using prefix_messages:

from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from llama_index.core.base.llms.types import ChatMessage

llm = OpenAI(model="gpt-3.5-turbo-0613")

prefix_messages = [ChatMessage(content="You are a helpful assistant.", role="system")]

agent = OpenAIAgent.from_tools(
    [multiply_tool, add_tool],
    llm=llm,
    verbose=True,
    prefix_messages=prefix_messages,
)

Note: You cannot specify both system_prompt and prefix_messages simultaneously [1][2].

2. Updating the query_engine_tool Settings

To adjust the default text_qa_template settings in the query_engine_tool, you can define custom prompts and pass them to the query engine. Here is how you can do it:

Define Custom Prompts:

from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core import ChatPromptTemplate

# Define the custom prompt strings
qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

refine_prompt_str = (
    "We have the opportunity to refine the original answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better "
    "answer the question: {query_str}. "
    "If the context isn't useful, output the original answer again.\n"
    "Original Answer: {existing_answer}"
)

# Create ChatPromptTemplate objects
chat_text_qa_msgs = [
    ChatMessage(role=MessageRole.SYSTEM, content="Always answer the question, even if the context isn't helpful."),
    ChatMessage(role=MessageRole.USER, content=qa_prompt_str),
]
text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

chat_refine_msgs = [
    ChatMessage(role=MessageRole.SYSTEM, content="Always answer the question, even if the context isn't helpful."),
    ChatMessage(role=MessageRole.USER, content=refine_prompt_str),
]
refine_template = ChatPromptTemplate(chat_refine_msgs)

Use the Custom Templates in the Query Engine:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

documents = SimpleDirectoryReader("./data/paul_graham/").load_data()
llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)
index = VectorStoreIndex.from_documents(documents)

# Query using the custom templates
print(
    index.as_query_engine(
        text_qa_template=text_qa_template,
        refine_template=refine_template,
        llm=llm,
    ).query("Who is Joe Biden?")
)

This setup allows you to customize the prompts used during interactions and retrievals [3][4][5].
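
Alternatively, if you already have a query engine object, you can swap the templates in place with update_prompts, using the same prompt-key convention shown in the question (a minimal sketch reusing the templates defined above; response_synthesizer:text_qa_template and response_synthesizer:refine_template are the standard keys for these prompts):

query_engine = index.as_query_engine(llm=llm)

# Replace the QA and refine prompts on the existing engine
query_engine.update_prompts(
    {
        "response_synthesizer:text_qa_template": text_qa_template,
        "response_synthesizer:refine_template": refine_template,
    }
)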

By following these steps, you can effectively modify the agent's default behavior prompt and update the query_engine_tool settings to suit your needs.


logan-markewich commented 2 months ago
  1. An OpenAI agent has no prompts of its own. It's all chat history; the tools get sent directly over OpenAI's API.
  2. You can update the query engine prompts exactly as you wrote, before converting it into a tool (see the sketch below).
  3. Tool names and descriptions are also important, and they can be set when creating the tool.
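
For example, a minimal sketch of points 2 and 3, reusing the query_engine and text_qa_template from above (the tool name and description here are placeholders):

from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

# 2. Update the prompts on the query engine first...
query_engine.update_prompts(
    {"response_synthesizer:text_qa_template": text_qa_template}
)

# ...then wrap it in a tool with a meaningful name and description
rag_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="document_qa",  # placeholder name
    description="Answers questions about the indexed documents.",
)

# The agent decides when to call the tool based on its name/description
agent = OpenAIAgent.from_tools([rag_tool], llm=llm, verbose=True)
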
WILLOSCAR commented 2 months ago
  1. An OpenAI agent has no prompts of its own. It's all chat history; the tools get sent directly over OpenAI's API.
  2. You can update the query engine prompts exactly as you wrote, before converting it into a tool.
  3. Tool names and descriptions are also important, and they can be set when creating the tool.

Can these settings be applied automatically when initializing with index.as_chat_engine()? I traced through the source code layer by layer, and it doesn't seem to work:

chat_engine = index.as_chat_engine(
    chat_mode='best',
    verbose=True,
    similarity_top_k=10,
    # response_synthesizer=response_synthesizer,
    node_postprocessors=[
        LLMRerank(
            choice_batch_size=2,
            top_n=3,
        )
    ],
    text_qa_template=yueyu_system_prompt,
)

I would like to ask one more question: how can I force the agent to trigger a specific tool? For example, I want it to call the RAG QueryEngineTool on every query.