errolyan opened this issue 5 years ago
ValueError: optimizing a parameter that doesn't require gradients
Can you show which part of the code raises this error?
Using torch==0.4.1 seems to resolve this. Not sure yet if this would affect anything else later on.
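Instead of pinning an old torch version, the usual workaround for this error is to pass the optimizer only the parameters with `requires_grad=True`. This is a minimal sketch (the `nn.Linear` model and the frozen `weight` are just illustrative assumptions, not from this repo):

```python
import torch
import torch.nn as nn

# Toy model with one frozen parameter, to reproduce the situation.
model = nn.Linear(4, 2)
model.weight.requires_grad = False  # frozen: would trigger the ValueError on older torch

# Filter out frozen parameters before handing them to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.01)

print(len(optimizer.param_groups[0]["params"]))  # only the bias remains
```

This works on any torch version, since the optimizer never sees the frozen parameters.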