Open pjessesco opened 2 years ago
Hi @pjessesco ,
What do you mean by "can't optimize"? Are the resulting gradients wrong?
Hi @Speierers , sorry for the ambiguous wording. I tried both
free_graph = True
free_graph = False
and printed the gradient with
print('Iteration %03i: error=%g' % (it, err_ref[0]), ek.gradient(params['my_envmap.data']))
The gradient value is empty if the option is False.
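For context, here is a minimal, hypothetical sketch (plain Python, not enoki's actual code) of what a free_graph flag on a reverse-mode AD tape typically controls; the Node/backward names are invented for illustration only:

```python
# Toy illustration (NOT enoki's implementation) of a `free_graph` flag:
# it decides whether graph nodes are discarded as soon as they have
# been consumed during the backward traversal.

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = list(parents)  # list of (parent_node, local_derivative)
        self.grad = 0.0

def backward(root, free_graph=True):
    """Accumulate gradients from `root` back through the graph."""
    root.grad = 1.0
    stack = [root]
    while stack:
        node = stack.pop()
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad
            stack.append(parent)
        if free_graph:
            # Edges are severed once consumed; the graph can no longer
            # be traversed afterwards (e.g. for visualization).
            node.parents = []

# y = (x * 3) + x  ->  dy/dx = 4
x = Node(2.0)
t = Node(x.value * 3, parents=[(x, 3.0)])
y = Node(t.value + x.value, parents=[(t, 1.0), (x, 1.0)])

backward(y, free_graph=False)
print(x.grad)          # 4.0
print(len(y.parents))  # 2 -- graph kept, so it can still be inspected
```

With free_graph=True the same traversal would empty every node's parent list, which is why a graph kept for visualization requires the option to be off.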
That is likely a bug in the enoki AD backend. Unfortunately, we are working hard on the upcoming release of the library, so I probably won't have time to look into this in the near future.
In any case, could you send me the whole Python snippet to reproduce this? I could take a brief look to see if there is anything obvious.
Here is the script; it's almost the same as the bunny example, except for the option.
This looks reasonable. It is likely a bug in the enoki AD backend then.
Summary
Hi, I'm trying to do differentiable rendering without the
free_graph
option. It looks like when the option is enabled, each visited node is erased after it is used to compute gradients while traversing the graph. My goal is to show each node's gradient in its label in the graph (using
ek.graphviz
), so I don't want my graph to be freed during backpropagation. However, I couldn't optimize the parameter with
free_graph=False
... thanks for your reply.
System configuration
master b92ddc234d1cb3510cba4db1601cd7f03e65a138, Ubuntu, CUDA 11.2
Steps to reproduce
In the
invert_bunny.py
example, apply the change below: