VedantR3907 opened 2 days ago
To modify the `Accumulate` response synthesizer to skip documents entirely when no context is provided, you can add a check within the `Accumulate` class so that it only processes documents that have context. Documents without context are skipped and contribute no response to the final output.

Here is an example of how you might modify the `Accumulate` class:
```python
# Assumed import path for the default QA prompt selector in the legacy package
from llama_index.legacy.prompts.default_prompt_selectors import (
    DEFAULT_TEXT_QA_PROMPT_SEL,
)
from llama_index.legacy.response_synthesizers.base import BaseSynthesizer


class Accumulate(BaseSynthesizer):
    def __init__(
        self,
        text_qa_template=None,
        service_context=None,
        output_cls=None,
        streaming=False,
        use_async=False,
    ):
        super().__init__(service_context=service_context, streaming=streaming)
        self._text_qa_template = text_qa_template or DEFAULT_TEXT_QA_PROMPT_SEL
        self._use_async = use_async
        self._output_cls = output_cls

    def synthesize(self, documents):
        responses = []
        for doc in documents:
            if not doc.context:  # Check if context is provided
                continue  # Skip the document if no context is provided
            response = self._service_context.llm.predict(
                self._text_qa_template, context_str=doc.context
            )
            responses.append(response)
        return self._accumulate_responses(responses)

    def _accumulate_responses(self, responses):
        # Combine the individual responses into a single output
        return " ".join(responses)
```
In this example, the `synthesize` method of the `Accumulate` class is modified to check whether each document's `context` attribute is set. If no context is provided (`if not doc.context`), the document is skipped (`continue`). This ensures that only documents with context are processed and included in the final output [1][2].
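The skip-and-accumulate pattern itself is plain Python and can be sketched without any LlamaIndex dependencies. In this minimal sketch, `Doc` and `fake_llm` are hypothetical stand-ins for a retrieved document and an LLM call:

```python
from dataclasses import dataclass


@dataclass
class Doc:
    # Hypothetical stand-in for a retrieved document
    context: str


def fake_llm(context: str) -> str:
    # Stand-in for an LLM call; just echoes the context
    return f"Answer based on: {context}"


def synthesize(documents: list[Doc]) -> str:
    responses = []
    for doc in documents:
        if not doc.context:  # skip documents with no context
            continue
        responses.append(fake_llm(doc.context))
    return " ".join(responses)


print(synthesize([Doc("foo"), Doc(""), Doc("bar")]))
# → Answer based on: foo Answer based on: bar
```

The empty-context document produces no output at all, rather than a "no context provided" sentence in the final answer.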
To continue talking to Dosu, mention @dosu.
Can we change the prompt instead of doing this?
Yes, you can change the prompt used by the accumulate response synthesizer so that it skips generating a response for documents that don't provide context for the query, without modifying the `Accumulate` class. You can achieve this by updating the prompt template to include logic that checks whether the document provides context for the query and skips generating a response if it doesn't.
Here is an example of how you might modify the prompt template:
```python
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.response_synthesizers import (
    ResponseMode,
    get_response_synthesizer,
)
from llama_index.core.service_context import ServiceContext


class CustomPromptTemplate(PromptTemplate):
    def partial_format(self, query_str: str) -> str:
        # Illustrative template logic: only answer when the context
        # relates to the query. The "contains" conditional below is
        # pseudocode for the idea, not a real template syntax.
        return f"""
{{% if context_str contains '{query_str}' %}}
Answer the following question based on the context: {query_str}
{{% else %}}
No relevant context found.
{{% endif %}}
"""


# Use the custom prompt template in the Accumulate synthesizer
custom_prompt_template = CustomPromptTemplate(
    "{context_str}{query_str}", prompt_type="QUESTION_ANSWER"
)
service_context = ServiceContext.from_defaults()
accumulate_synthesizer = get_response_synthesizer(
    service_context=service_context,
    text_qa_template=custom_prompt_template,
    response_mode=ResponseMode.ACCUMULATE,
)

# Example usage
query_str = "What is?"
texts = [
    "This is bar",
    "This is foo",
]
response = accumulate_synthesizer.get_response(
    text_chunks=texts, query_str=query_str
)
print(response)
```
This custom prompt template uses a conditional to check whether `context_str` contains `query_str`. If it does, it generates a response; otherwise, it returns "No relevant context found." This way, you can skip generating responses for documents that don't provide context for the query [1][2].
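The intent of that conditional can be sketched in plain Python. The function names here are hypothetical and for illustration only; a caller would still need to filter the sentinel out of the accumulated output:

```python
def render_response(context_str: str, query_str: str) -> str:
    # Mimics the template's conditional: answer only when the context
    # appears relevant to the query, otherwise emit a fixed sentinel
    if query_str.lower() in context_str.lower():
        return f"Answer the question based on the context: {context_str}"
    return "No relevant context found."


def accumulate(chunks: list[str], query_str: str) -> str:
    # Accumulate per-chunk responses, dropping the sentinel so
    # irrelevant chunks leave no trace in the final output
    responses = [render_response(c, query_str) for c in chunks]
    kept = [r for r in responses if r != "No relevant context found."]
    return "\n---\n".join(kept)


print(accumulate(["foo explains X", "unrelated text"], "foo"))
# → Answer the question based on the context: foo explains X
```

Note the difference from the class-modification approach: the prompt-only approach still calls the LLM for every chunk and relies on post-filtering the sentinel, whereas the modified `synthesize` skips the call entirely.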
How can I call the `update_prompts` method and pass the prompt in `Accumulate`?
To call the `update_prompts` method and pass a custom prompt in the `Accumulate` response synthesizer, follow these steps:

1. Define your custom prompt template.
2. Call the `update_prompts` method to update the prompt in the `Accumulate` response synthesizer.

Here is an example:
```python
from llama_index.core.prompts.base import PromptTemplate
from llama_index.core.response_synthesizers.accumulate import Accumulate

# Define your custom prompt template
custom_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query in a detailed manner.\n"
    "Query: {query_str}\n"
    "Answer: "
)
custom_prompt = PromptTemplate(custom_prompt_str)

# Update the prompt in the Accumulate response synthesizer
accumulate_synthesizer = Accumulate()
accumulate_synthesizer.update_prompts(
    {"text_qa_template": custom_prompt}
)
```
This code snippet defines a custom prompt template and updates the `text_qa_template` in the `Accumulate` response synthesizer using the `update_prompts` method [1][2][3].
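The `update_prompts` call follows a simple dictionary-merge pattern: prompts are stored by name, and you replace only the keys you supply. This sketch uses a hypothetical minimal class, not the real LlamaIndex base class:

```python
class PromptMixinSketch:
    # Minimal sketch of a prompt-dict API: prompts are stored by
    # name and updated by merging in a dict of replacements
    def __init__(self):
        self._prompts = {"text_qa_template": "default QA prompt"}

    def get_prompts(self) -> dict:
        return dict(self._prompts)

    def update_prompts(self, prompts: dict) -> None:
        # Only replace the keys the caller supplies; keep the rest
        self._prompts.update(prompts)


synth = PromptMixinSketch()
synth.update_prompts({"text_qa_template": "custom QA prompt"})
print(synth.get_prompts()["text_qa_template"])
# → custom QA prompt
```

Because only the supplied keys are replaced, you can swap `text_qa_template` without touching any other prompts the synthesizer holds.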
Question
I was using the accumulate response synthesizer and wanted to know if I could change its prompt. Basically, accumulate generates many responses for the retrieved chunks of data. When a document doesn't provide context for the query, it writes something like "No context is provided in the document [Document Name or something]". Instead, can we change the function so that it doesn't write those things in the final output, and just skips the document (and its response) when no context is provided?