run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai

[Question]: Is there any way to use a Node Postprocessor in a Function Calling Agent? #14891

Open ryann-sportsbet opened 2 months ago

ryann-sportsbet commented 2 months ago

Question

Hi guys,

Has anyone successfully incorporated Node Postprocessor into Function Calling Agent?

I've been researching the docs and the source code. It seems like the Node Postprocessor is only available for the Query Engine.

dosubot[bot] commented 2 months ago

@ryann-sportsbet Hello! I'm here to help you with bugs, questions, and becoming a contributor. Let's solve this puzzle together! 🤖

Yes, it is possible: you attach the Node Postprocessor to the query engine that the agent uses as a tool. Here is an example of how to add a SimilarityPostprocessor to a RetrieverQueryEngine:

from llama_index.core import VectorStoreIndex, get_response_synthesizer
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.postprocessor import SimilarityPostprocessor

# build index
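# (assumes `documents` has already been loaded, e.g. with SimpleDirectoryReader)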
index = VectorStoreIndex.from_documents(documents)

# configure retriever
retriever = VectorIndexRetriever(
    index=index,
    similarity_top_k=10,
)

# configure response synthesizer
response_synthesizer = get_response_synthesizer()

# assemble query engine
query_engine = RetrieverQueryEngine(
    retriever=retriever,
    response_synthesizer=response_synthesizer,
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.7)],
)

# query
response = query_engine.query("What did the author do growing up?")
print(response)

This example shows a SimilarityPostprocessor incorporated into the RetrieverQueryEngine [1].

To continue talking to Dosu, mention @dosu.

logan-markewich commented 2 months ago

What is the tool you are giving the agent? If it's a query engine, you can supply the node postprocessors there.

If it's something else, just make a FunctionTool and do whatever processing you need in there.

ryann-sportsbet commented 1 month ago

@logan-markewich it is a mixture of a query engine and function calling with FunctionCallingAgentWorker. I guess I can add the node postprocessor to the RetrieverQueryEngine, wrap that in a QueryEngineTool, and then pass it to the FunctionCallingAgentWorker.
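
Something like this, I suppose (a rough sketch; it assumes `documents` and a function-calling `llm` are already set up, and the tool name and description are just placeholders):

from llama_index.core import VectorStoreIndex, get_response_synthesizer
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.postprocessor import SimilarityPostprocessor
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent import FunctionCallingAgentWorker

# query engine with the node postprocessor, as in the example above
index = VectorStoreIndex.from_documents(documents)
query_engine = RetrieverQueryEngine(
    retriever=VectorIndexRetriever(index=index, similarity_top_k=10),
    response_synthesizer=get_response_synthesizer(),
    node_postprocessors=[SimilarityPostprocessor(similarity_cutoff=0.7)],
)

# wrap the query engine as a tool (name/description are placeholders)
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="knowledge_base",
    description="Useful for answering questions about X.",
)

# hand the tool to the agent worker alongside any other function tools
agent_worker = FunctionCallingAgentWorker.from_tools([query_tool], llm=llm, verbose=True)
agent = agent_worker.as_agent()
response = agent.chat("What did the author do growing up?")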

What if the FunctionCallingAgentWorker contains a custom function that only queries data from an API?

logan-markewich commented 1 month ago

You could do that too. In a custom function, you can do whatever you want with the queried data.

logan-markewich commented 1 month ago

from llama_index.core.tools import FunctionTool

def query(query_str: str) -> str:
    """Useful for querying information about X."""
    ...

tool = FunctionTool.from_defaults(fn=query)
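
And if the function only hits an API, a node postprocessor can still be run by hand over the results. A rough sketch (the endpoint, response shape, and the query_api name here are purely illustrative):

import requests  # any HTTP client works; this API call is purely illustrative

from llama_index.core.postprocessor import SimilarityPostprocessor
from llama_index.core.schema import NodeWithScore, TextNode
from llama_index.core.tools import FunctionTool

def query_api(query_str: str) -> str:
    """Useful for querying information about X from an external API."""
    # hypothetical endpoint and response shape -- replace with your own
    results = requests.get("https://example.com/search", params={"q": query_str}).json()

    # wrap the raw results as scored nodes so a node postprocessor can run over them
    nodes = [
        NodeWithScore(node=TextNode(text=r["text"]), score=r.get("score", 1.0))
        for r in results
    ]

    # apply any node postprocessor manually
    filtered = SimilarityPostprocessor(similarity_cutoff=0.7).postprocess_nodes(
        nodes, query_str=query_str
    )
    return "\n".join(n.node.get_content() for n in filtered)

tool = FunctionTool.from_defaults(fn=query_api)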