By default, Dr.Jit’s AD system destroys the enqueued portion of the AD graph as it is traversed. However, the AOV integrator’s render_forward/render_backward functions consist of separate dr::forward_to/dr::backward_from calls for the AOV channels and for the inner integrator’s output. We therefore need to propagate gradients through a shared AD subgraph multiple times.