run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License
33.29k stars 4.66k forks

[Question]: Use Prompt Template with Chat Engine #14272

Open anantgupta129 opened 2 weeks ago

anantgupta129 commented 2 weeks ago

Question

With the ReAct Agent, I noticed that using a prompt template gave more reliable and controlled output than a system prompt:

from llama_index.core.agent import ReActAgent

agent = ReActAgent.from_tools(tool_spec, llm=llm, verbose=True)
agent.update_prompts({"agent_worker:system_prompt": REACT_SYSTEM_PROMPT})
agent.reset()

How can I use a prompt template with an index used as a chat engine, or with a simple chat engine?

from llama_index.core import VectorStoreIndex

index = VectorStoreIndex.from_vector_store(vector_store=vector_store)
engine = index.as_chat_engine(
    chat_mode="condense_plus_context", streaming=True, system_prompt=system_prompt
)

chatEngine = SimpleChatEngine.from_defaults(streaming=True)

Instead of the system prompt, I want to use a prompt template with the chat engine.

dosubot[bot] commented 2 weeks ago

To use a prompt template with the chat engine instead of the system prompt, you can define a custom chat prompt template using ChatPromptTemplate and then format it for use with the chat API. Here is an example:

from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.prompts import ChatPromptTemplate

# Define the message templates
message_templates = [
    ChatMessage(content="You are an expert system.", role=MessageRole.SYSTEM),
    ChatMessage(
        content="Generate a short story about {topic}",
        role=MessageRole.USER,
    ),
]

# Create the chat prompt template
chat_template = ChatPromptTemplate(message_templates=message_templates)

# Format the messages for the chat API
messages = chat_template.format_messages(topic="a brave knight")

# Alternatively, convert to a text prompt for the completion API
prompt = chat_template.format(topic="a brave knight")
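To make the two formatting paths concrete, here is a small plain-Python sketch (no llama_index dependency; the function names and message shapes are illustrative, not the library's internals) of what format_messages and format conceptually produce from the same templates:

```python
# Illustrative-only sketch: format_messages keeps the chat structure,
# while format flattens everything into a single completion-style string.

def format_messages(templates, **kwargs):
    """Substitute variables into each message, keeping (role, content) pairs."""
    return [(role, content.format(**kwargs)) for role, content in templates]

def format_as_text(templates, **kwargs):
    """Flatten the same messages into one prompt string for a completion API."""
    return "\n".join(
        f"{role}: {content.format(**kwargs)}" for role, content in templates
    )

templates = [
    ("system", "You are an expert system."),
    ("user", "Generate a short story about {topic}"),
]

messages = format_messages(templates, topic="a brave knight")
# -> [('system', 'You are an expert system.'),
#     ('user', 'Generate a short story about a brave knight')]

prompt = format_as_text(templates, topic="a brave knight")
# First line: "system: You are an expert system."
```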

Alternatively, you can use the ChatPromptTemplate.from_messages method for a more concise setup:

from llama_index.core.prompts import ChatPromptTemplate

# Define the message templates using from_messages
message_templates = [
    ("system", "You are an expert system."),
    ("user", "Generate a short story about {topic}"),
]

# Create the chat prompt template
chat_template = ChatPromptTemplate.from_messages(message_templates)

# Format the messages for the chat API
messages = chat_template.format_messages(topic="a brave knight")

# Alternatively, convert to a text prompt for the completion API
prompt = chat_template.format(topic="a brave knight")
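As a rough plain-Python mock of what the tuple-based setup above does (the class and function names here are hypothetical, not llama_index internals), from_messages normalizes (role, content) pairs into role-tagged message templates, which format_messages then fills in:

```python
from dataclasses import dataclass

# Hypothetical stand-in for the library's chat message type.
@dataclass
class Message:
    role: str
    content: str

def from_messages(pairs):
    """Normalize (role, content) tuples into Message templates."""
    return [Message(role=role, content=content) for role, content in pairs]

def format_messages(templates, **kwargs):
    """Substitute template variables into each message's content."""
    return [Message(m.role, m.content.format(**kwargs)) for m in templates]

template = from_messages([
    ("system", "You are an expert system."),
    ("user", "Generate a short story about {topic}"),
])
msgs = format_messages(template, topic="a brave knight")
# msgs[1].content == "Generate a short story about a brave knight"
```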

This code snippet demonstrates how to create a custom chat prompt template and format it for use with the chat API. The ChatPromptTemplate class allows you to define a sequence of ChatMessage objects with specified roles and content, which can then be formatted with specific variables for use in the chat engine. The from_messages method provides a more concise way to define the chat prompt template.

Additionally, you can use advanced features such as partial formatting and output parsers. For example:

from typing import Any

from llama_index.core.llms import ChatMessage, MessageRole
from llama_index.core.prompts import ChatPromptTemplate
from llama_index.core.types import BaseOutputParser

class MockOutputParser(BaseOutputParser):
    def __init__(self, format_string: str) -> None:
        self._format_string = format_string

    def parse(self, output: str) -> Any:
        return {"output": output}

    def format(self, query: str) -> str:
        return query + "\n" + self._format_string

output_parser = MockOutputParser(format_string="output_instruction")

chat_template = ChatPromptTemplate(
    message_templates=[
        ChatMessage(
            content="This is a system message with a {sys_param}",
            role=MessageRole.SYSTEM,
        ),
        ChatMessage(content="hello {text} {foo}", role=MessageRole.USER),
    ],
    output_parser=output_parser,
)

messages = chat_template.format_messages(
    text="world", foo="bar", sys_param="sys_arg"
)

This example shows how to integrate an output parser with your chat prompt template, allowing for more complex formatting and parsing of the chat messages [1][2][3][4][5].
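To walk through the parser's two hooks in isolation, here is the same MockOutputParser logic as standalone Python (no llama_index dependency; the format_string value is a made-up example): format appends output instructions to the outgoing prompt text, and parse wraps the raw completion into a structured result.

```python
# Standalone version of the MockOutputParser flow above.
class MockOutputParser:
    def __init__(self, format_string):
        self._format_string = format_string

    def parse(self, output):
        # Wrap the raw LLM completion into a structured dict.
        return {"output": output}

    def format(self, query):
        # Append output instructions to the query before sending it.
        return query + "\n" + self._format_string

parser = MockOutputParser("Answer in JSON.")

# Before sending: instructions are appended to the prompt text.
prompt_text = parser.format("hello world bar")
# -> "hello world bar\nAnswer in JSON."

# After the LLM responds: the raw text is parsed into a dict.
result = parser.parse("Once upon a time...")
# -> {"output": "Once upon a time..."}
```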

To continue talking to Dosu, mention @dosu.