facebookresearch/higher

higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual training steps.
Apache License 2.0

once_differentiable #137

Open aooating opened 1 year ago

aooating commented 1 year ago

When running `diffopt.step(buff_losses)`, the following error occurs: `RuntimeError: trying to differentiate twice a function that was marked with @once_differentiable`. However, when setting `track_higher_grads: bool = False`, it runs correctly. I would like to know why this issue occurs. Thanks a lot for answering my question!
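For context, here is a minimal sketch (not the original poster's code) of where `track_higher_grads` is usually passed to `higher.innerloop_ctx`; the model, optimizer, data, and losses are placeholder assumptions. With `track_higher_grads=True` (the default), `diffopt.step` builds a differentiable graph of each inner update, so the outer `backward()` has to differentiate through the inner loss's graph again; with `track_higher_grads=False`, the inner updates are detached and no second differentiation of those ops happens.

```python
# Minimal sketch, assuming a toy linear model and MSE losses (placeholders,
# not the original poster's setup).
import torch
import torch.nn as nn
import torch.nn.functional as F
import higher

model = nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

# track_higher_grads=True keeps the graph of the inner updates so that
# gradients can later flow back through diffopt.step(...).
with higher.innerloop_ctx(model, opt, track_higher_grads=True) as (fmodel, diffopt):
    for _ in range(3):
        inner_loss = F.mse_loss(fmodel(x), y)
        diffopt.step(inner_loss)      # differentiable inner step
    outer_loss = F.mse_loss(fmodel(x), y)
    outer_loss.backward()             # gradients w.r.t. the original model.parameters()
```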

aooating commented 1 year ago


Does this setting have any effect on the final result?