zou-group / textgrad

TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
http://textgrad.com/
MIT License

Designing Loss function with non-binary outputs #113

Open nikhilk7153 opened 3 weeks ago

nikhilk7153 commented 3 weeks ago

Hello,

I am looking at the TextGrad loss functions for the prompt optimization cases. I noticed that all of them support binary cases, where the answer is either correct or incorrect. Could this codebase be modified to support cases where the score is an integer or decimal, for metrics such as F1, precision, or recall? If so, how would I go about implementing this?

Thanks,

Nikhil

simra-shahid commented 2 weeks ago

In addition to this, can we have multiple outputs and multiple loss optimizations? For example, a classification output and its reasoning. Can we optimize for both?

mertyg commented 2 weeks ago

Hi @nikhilk7153 -- yes, this should be fairly straightforward. For instance, you can let the inner function of this class be a function computing F1/precision/recall, e.g., StringBasedFunction(f1, "computes the F1 score of the answer").
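As a minimal sketch of what such an inner function might look like, here is a token-level F1 (SQuAD-style) written with only the standard library. The `f1` helper and its `prediction`/`ground_truth` parameter names are hypothetical, not part of textgrad; the idea is that it could then be wrapped along the lines of `StringBasedFunction(f1, function_purpose="computes the F1 score of the answer")`, with the inputs passed in as textgrad Variables.

```python
from collections import Counter

def f1(prediction: str, ground_truth: str) -> str:
    """Token-level F1 between a predicted answer and the reference.

    Returns the score as a string, on the assumption that a string-based
    loss expects textual output it can feed back into the textual gradient.
    """
    pred_tokens = prediction.lower().split()
    gold_tokens = ground_truth.lower().split()
    if not pred_tokens or not gold_tokens:
        return "0.00"
    # Multiset intersection counts each overlapping token at most as
    # often as it appears in both strings.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return "0.00"
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    score = 2 * precision * recall / (precision + recall)
    return f"{score:.2f}"
```

Precision-only or recall-only variants are the same function minus the harmonic mean, so one helper per metric keeps the loss descriptions honest.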

hello @simra-shahid! I don't think I fully understand this question. Do you mean you would like to simultaneously refine the reasoning and the final answer for a given question, akin to what we call "Instance Optimization" in the paper? Sorry for not getting this -- an example would be very helpful!