Closed — reiinakano closed this issue 7 years ago
As long as torch does not raise:
"RuntimeError: Trying to backward through the graph second time, but the buffers have already been freed. Please specify retain_variables=True when calling backward for the first time."
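For anyone who runs into this error, a minimal sketch of what triggers it (note: current PyTorch spells the flag `retain_graph=True`; `retain_variables=True` is the older name used in the error message above):

```python
import torch

x = torch.ones(2, requires_grad=True)
y = (x * x).sum()
y.backward()              # first backward: fine, intermediate buffers are freed

raised = False
try:
    y.backward()          # second backward through the same graph
except RuntimeError:
    raised = True         # "Trying to backward through the graph a second time ..."

# Keeping the buffers alive allows repeated backward calls,
# and gradients accumulate into x.grad across them:
z = (x * x).sum()
z.backward(retain_graph=True)
z.backward()              # works; x.grad now holds the sum of both passes
```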
then you are not accumulating gradients across graphs. It works because the different losses do not share any parameters. But you are right, the code could be cleaner with a single call to backward. I actually wrote this tutorial when I was discovering PyTorch, so it was written clumsily. When I have time, I will make a nicer version.
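A sketch of that cleaner single-backward version, using hypothetical stand-ins for the tutorial's content/style targets and network layer (none of these names are from the tutorial itself): because gradients are additive, calling backward once on the summed loss produces the same gradients on the input image as calling backward on each loss separately.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for the tutorial's targets and a network layer.
content_target = torch.randn(1, 8)
style_target = torch.randn(1, 8)
layer = nn.Linear(8, 8)

input_img = torch.randn(1, 8, requires_grad=True)

features = layer(input_img)
content_loss = F.mse_loss(features, content_target)
style_loss = F.mse_loss(features, style_target)

# One backward call on the summed loss, instead of one call per loss module.
total_loss = content_loss + style_loss
total_loss.backward()
```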
Cool. Thanks!
Hi, not sure if this is the right place to ask questions, but I'm working through the neural style transfer tutorial and am confused about something.
What is the purpose of the `backward` method in `ContentLoss` and `StyleLoss`? If we remove the `backward` method, won't this work as well for the `closure` function in `run_style_transfer`?

On a related note, won't multiple `backward` calls in the original code accumulate the gradients for the image? Why is it okay to do this? Am I wrong in assuming that you should only call `backward` once? I'm new to PyTorch so I apologize if I'm missing anything fundamental. Thanks!

EDIT: Tagging the author @alexis-jacq if you don't mind :)
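For reference, multiple `backward` calls really do accumulate into `.grad` — a minimal sketch with two independent losses (playing the role of the separate content/style loss modules):

```python
import torch

img = torch.zeros(3, requires_grad=True)

# Two independent graphs, no shared intermediate buffers.
loss_a = (img * 2.0).sum()
loss_b = (img * 5.0).sum()

loss_a.backward()   # img.grad is now [2, 2, 2]
loss_b.backward()   # gradients accumulate: img.grad is now [7, 7, 7]
```

The accumulated gradient equals the gradient of the summed loss, which is why the optimizer step behaves the same either way.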