zou-group / textgrad

TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
http://textgrad.com/
MIT License

Problems in running the quickstart #50

Closed. HelloWorldLTY closed this issue 3 weeks ago.

HelloWorldLTY commented 1 month ago

Hi, thanks for your great work! I am trying to follow the three steps shown at https://textgrad.com/index.html#application but failed at the second step:

[screenshot of the error message]

Did I miss anything? Thanks a lot.

vinid commented 1 month ago

Hello! You need to set a global backward engine in this context:

import textgrad as tg

tg.set_backward_engine("gpt-4o", override=True)
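For completeness, a sketch of the quickstart's first step with the backward engine set up front (assuming textgrad is installed and an OpenAI API key is available in the environment):

```python
# Minimal sketch: the quickstart's first step plus the missing engine setup.
# Assumes `pip install textgrad` and OPENAI_API_KEY set in the environment.
import textgrad as tg

# Set the engine used to backpropagate textual gradients (the missing step).
tg.set_backward_engine("gpt-4o", override=True)

# Step 1: get an initial response from an LLM.
model = tg.BlackboxLLM("gpt-4o")
prompt = tg.Variable(
    "write a punchline for my github package about optimizing compound AI systems",
    role_description="prompt",
    requires_grad=False,
)
punchline = model(prompt)
punchline.set_role_description("a concise punchline that must hook everyone")
```

With the backward engine set, the later steps that call `loss.backward()` have an LLM to generate the textual gradients with, which is what the reported error is about.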

Could you point me to where the example is? We might need to update that!

HelloWorldLTY commented 1 month ago

Sure, I will give it a try. The original example does not include such a step:

import textgrad as tg
# Step 1: Get an initial response from an LLM
model = tg.BlackboxLLM("gpt-4o")
punchline = model(tg.Variable("write a punchline for my github package about optimizing compound AI systems", role_description="prompt", requires_grad=False))
punchline.set_role_description("a concise punchline that must hook everyone")

The example can be found at: https://textgrad.readthedocs.io/en/latest/index.html