Closed by Jesse-jApps 6 years ago
Ouch, good catch!
I think we should move this to the validation function; here we are setting the net in evaluation mode to freeze batch norms, etc. already:
What about decorating validation with @torch.no_grad()?
Sure that would work, too!
I overlooked that no_grad() as a decorator is only available since PyTorch 0.4.1. I committed a workaround using a context manager. As soon as robosat is upgraded to 0.4.1, the additional context-manager code can be discarded.
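The actual committed workaround isn't shown here, but the idea can be sketched as follows: since torch.no_grad() works as a context manager from PyTorch 0.4.0 (decorator support only arrived in 0.4.1), the validation body can be wrapped in a small context manager instead of decorated. The helper name no_grad_if_available and the validate signature are illustrative, not from the robosat commit; the fallback branch just lets the sketch run even without torch installed.

```python
from contextlib import contextmanager


@contextmanager
def no_grad_if_available():
    """Run the wrapped block inside torch.no_grad() when torch is importable.

    Hypothetical helper: torch.no_grad() is usable as a context manager
    since PyTorch 0.4.0; using it as a decorator requires 0.4.1.
    """
    try:
        import torch
    except ImportError:
        # No-op fallback so this sketch stays runnable without torch.
        yield
        return
    with torch.no_grad():
        yield


def validate(model, loader):
    # model.eval() would go here to freeze batch norms, dropout, etc.
    with no_grad_if_available():
        # No gradients are tracked inside this block, which avoids
        # accumulating autograd state during validation.
        return [model(batch) for batch in loader]
```

Once the project requires PyTorch >= 0.4.1, the helper can be dropped and validation decorated directly with @torch.no_grad().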
Merci!
no_grad() is available since PyTorch 0.4.0 and prevents possible 'out of memory' errors during validation.