zou-group / textgrad

TextGrad: Automatic "Differentiation" via Text -- using large language models to backpropagate textual gradients.
http://textgrad.com/
MIT License

Duplicate line of code when computing _backward_idempotent #101

Open linyuhongg opened 1 month ago

linyuhongg commented 1 month ago

In variable.py, the + operation calls _backward_idempotent for autograd. However, this function contains a duplicate line of code: line 349 and line 355 are identical, which results in duplicate gradients for every variable involved in the '+' Function.
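To illustrate the reported pattern, here is a minimal sketch (hypothetical names and simplified structure, not the actual textgrad source): a backward function for '+' that appends the same gradient to each input variable twice, mirroring the duplicated line.

```python
# Toy stand-in for textgrad's Variable; in the real library, gradients
# are accumulated on each variable during backward().
class Variable:
    def __init__(self, value):
        self.value = value
        self.gradients = []

def _backward_idempotent(variables, summation_gradient):
    """Propagate the summation's gradient to each input of '+'."""
    for variable in variables:
        variable.gradients.append(summation_gradient)  # first add (cf. line 349)
        # ... intervening bookkeeping ...
        variable.gradients.append(summation_gradient)  # duplicate add (cf. line 355)

x, y = Variable("a"), Variable("b")
_backward_idempotent([x, y], "feedback for the sum")
print(len(x.gradients))  # 2 -- each input receives the same gradient twice
```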

linyuhongg commented 1 month ago

Perhaps the second line was meant to work with _reduce_meta? However, it seems that when _reduce_meta is not defined, the duplicated line still runs, producing duplicate gradients when using +.
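Under that reading, a hedged sketch of the fix (reusing the toy Variable above; this is the commenter's assumed intent, not the library's confirmed behavior) would guard the second add so the extra gradient copy is only appended when a reduction grouping is attached:

```python
def _backward_idempotent_fixed(variables, summation_gradient, reduce_meta=None):
    for variable in variables:
        variable.gradients.append(summation_gradient)
        if reduce_meta is not None:
            # The extra copy only applies when a _reduce_meta-style
            # grouping exists; without it, each input gets one gradient.
            variable.gradients.append(summation_gradient)

x, y = Variable("a"), Variable("b")
_backward_idempotent_fixed([x, y], "feedback for the sum")
print(len(x.gradients))  # 1 -- no duplicate when _reduce_meta is absent
```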