vsitzmann / siren

Official implementation of "Implicit Neural Representations with Periodic Activation Functions"
MIT License

allows to take derivative w.r.t. input #12

Open pragmascript opened 4 years ago

pragmascript commented 4 years ago

In the notebook "explore_siren.ipynb", can you explain why the line coords.clone().detach().requires_grad_(True) is necessary? Why wouldn't you be able to get a gradient otherwise?

    def forward(self, coords):
        coords = coords.clone().detach().requires_grad_(True) # allows to take derivative w.r.t. input
        output = self.net(coords)
        return output, coords        
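For context, the coords tensor returned from forward is what later gets differentiated against. A minimal sketch of that pattern, using a stand-in MLP rather than the repo's Siren class (the network and shapes here are just illustrative):

    import torch
    from torch import nn

    # Stand-in MLP (not the repo's Siren class) so the forward() pattern
    # above can be run end to end.
    net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))

    coords = torch.rand(16, 2)                             # batch of 2D coordinates
    coords = coords.clone().detach().requires_grad_(True)  # fresh leaf tensor
    output = net(coords)

    # Derivative of the output w.r.t. the input coordinates; this is what
    # the returned coords tensor is for.
    grad = torch.autograd.grad(output, coords,
                               grad_outputs=torch.ones_like(output),
                               create_graph=True)[0]
    print(grad.shape)  # torch.Size([16, 2])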
kwea123 commented 4 years ago

I have a similar question: why is clone().detach() necessary? Shouldn't coords.requires_grad_(True) be enough? Since the coords are not among the training parameters, they won't be updated.
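For reference, a minimal sketch of the two variants being compared, with plain tensors instead of the Siren model (names and shapes are just illustrative):

    import torch

    # If coords arrives as an ordinary leaf tensor (e.g. straight from a
    # dataloader), requires_grad_(True) alone is indeed enough to take a
    # gradient w.r.t. it:
    coords = torch.rand(16, 2)
    coords.requires_grad_(True)
    y = (coords ** 2).sum()
    print(torch.autograd.grad(y, coords)[0].shape)  # torch.Size([16, 2])

    # requires_grad_ mutates the caller's tensor in place, though, so the
    # flag leaks out of forward(). clone().detach() instead builds a fresh
    # leaf that is private to the call:
    caller_coords = torch.rand(16, 2)
    fresh = caller_coords.clone().detach().requires_grad_(True)
    print(caller_coords.requires_grad)  # False: caller's tensor untouched
    print(fresh.is_leaf)                # True: gradients can target it directly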