zou-group / textgrad

TextGrad: Automatic ''Differentiation'' via Text -- using large language models to backpropagate textual gradients.
http://textgrad.com/
MIT License

How to use with RAG? #34

Open v5out opened 5 months ago

v5out commented 5 months ago

TextGrad is exciting! Very much appreciate your efforts.

How to apply textgrad to retrieval-augmented-generation (RAG) for question & answer?

With RAG there are three inputs: the source material (text), the question and the answer (which needs to be evaluated and improved).

It's not clear to me how to handle those with one or more Variables, etc.

Something like?

x = Variable("Source material: " + [source material] + " and a question about it: " + [question], role_description="Answering a question on source material", requires_grad=True)

samgregson commented 4 months ago

I agree with this; the examples are somewhat lacking. I would love to use TextGrad for prompts that combine non-optimisable context with optimisable text (like RAG above). I'd also like to see examples of chains of LLM calls (not just a single LLM call as in the current examples). Finally, the paper mentions optimisation for tool calling; how do we achieve this? Thanks for the awesome development, I just wish I knew how to use it in more complex settings!

mertyg commented 4 months ago

@v5out thank you for your interest! @vinid will soon put together an example for RAG!

@samgregson thank you for the kind words -- we totally agree we should have more examples, and they are lacking right now. Thanks for bearing with us in the meantime. We will certainly add more examples soon showing how to optimize parts of prompts, more complex setups, and more functionality. Sorry for the delay; we're currently a small team working on this, but we hope to grow it along with our community!

jaggiK commented 1 week ago

Thank you for maintaining this library. I am also interested in the implementation of these features. Are there any updates available?