In the notebook "explore_siren.ipynb", can you explain why the line coords.clone().detach().requires_grad_(True) is necessary? Why wouldn't you be able to get a gradient otherwise?
def forward(self, coords):
    coords = coords.clone().detach().requires_grad_(True)  # allows taking derivatives w.r.t. the input
    output = self.net(coords)
    return output, coords
I have a similar question: why is clone().detach() necessary? Shouldn't coords.requires_grad_(True) be enough? Since coords is not among the training parameters, it won't be updated by the optimizer anyway.
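A minimal sketch of the behavior in question (using a plain torch.nn.Linear as a hypothetical stand-in for the notebook's SIREN network, since the actual model definition is not quoted here). It shows that autograd can only produce a gradient w.r.t. a tensor that is tracked in the graph, which is what requires_grad_(True) arranges, and that clone().detach() first creates a fresh leaf tensor so the flag can be set without touching whatever graph or storage the incoming coords might belong to:

```python
import torch

net = torch.nn.Linear(2, 1)   # hypothetical stand-in for self.net
coords = torch.rand(8, 2)     # inputs typically arrive with requires_grad=False

# Without tracking, there is nothing to differentiate w.r.t.:
out = net(coords)
# torch.autograd.grad(out.sum(), coords)  # would raise: coords does not require grad

# clone().detach() yields a new leaf tensor, cut off from any prior graph,
# and requires_grad_(True) tells autograd to record operations on it:
coords2 = coords.clone().detach().requires_grad_(True)
out2 = net(coords2)
grad = torch.autograd.grad(out2.sum(), coords2)[0]
print(grad.shape)  # same shape as coords2
```

Note that requires_grad_ is an in-place flag flip, which is why it is applied to a fresh leaf obtained via clone().detach() rather than to the incoming tensor directly.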