tchang1997 opened 3 years ago
It's saying `ctx` is not defined for me; am I missing something?
Missed a typo; thanks for pointing that out. I overwrote my previous changes on my branch. Does it work for you now?
The `ctx` not defined error is solved, but there's a new error. It seems like `save_for_backward` can only save tensors, not `dim` (int) or `consensus_type` (str): `TypeError: save_for_backward can only save variables, but argument 1 is of type int`
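(Editorial aside: the usual way around that `TypeError` is to pass only tensors to `save_for_backward` and attach non-tensor arguments such as `dim` directly to `ctx` as plain attributes. The `MeanConsensus` class below is a hypothetical minimal sketch of that pattern, not the code from this PR.)

```python
import torch

class MeanConsensus(torch.autograd.Function):
    """Hypothetical sketch: save_for_backward accepts only tensors,
    so non-tensor arguments are stored on ctx directly."""

    @staticmethod
    def forward(ctx, x, dim=1):
        ctx.save_for_backward(x)  # tensors only
        ctx.dim = dim             # ints/strings go on ctx as attributes
        return x.mean(dim=dim, keepdim=True)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # the mean's gradient spreads evenly over the reduced dimension
        grad_in = grad_output.expand(x.size()) / x.size(ctx.dim)
        return grad_in, None      # one slot per forward() argument

x = torch.randn(2, 3, 4, requires_grad=True)
MeanConsensus.apply(x, 1).sum().backward()
print(x.grad.shape)  # same shape as x: torch.Size([2, 3, 4])
```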
I've created a PR on your fork with the code that works for me: PR
Thank you so much for sharing your implementation of TPN!
Problem
I've been trying to get it to work in one of my own projects -- however, I ran into the same issue as mentioned in #28, in which the user pastes a stack trace with the error message
"Legacy autograd function with non-static forward method is deprecated."
This occurs when you try to call `forward()` with the old code when the averaging consensus (`_SimpleConsensus`) is used.
Environment
Summary of changes
In order to make the `_SimpleConsensus` class (subclassing `torch.autograd.Function`) compatible with PyTorch >1.3:

- Removed the `__init__` method from `_SimpleConsensus`.
- Call the `apply` static method instead of `forward` for passing the input tensor through the `_SimpleConsensus` object.
- The `forward()` method of `_SimpleConsensus` uses `ctx.save_for_backward(args)` to cache the input tensor `x`, `dim`, and `consensus_type`.
- `self.shape` is no longer a member of `_SimpleConsensus`; it is reconstructed by retrieving `x` from `ctx.saved_tensors` and calling `x.size()` in each call to `backward()`.

This is consistent with the template given in the PyTorch docs, which I referenced.
Discussion
The changes in this PR work for me -- I am able to run `forward()` without issue now. However, as a disclaimer, due to the nature of my project, I'm using my own testing script instead of the provided testing framework in this repo. For completeness, my model loading code looks like this:

Please let me know if there's any additional testing (suites or otherwise) I should run, or if there's a contributing guide that I've overlooked. Furthermore, I'm happy to provide more details as needed. Thanks!
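(Editorial aside: one generic check that applies to any custom `torch.autograd.Function`, independent of the repo's test suite, is `torch.autograd.gradcheck`, which compares the analytic `backward()` against finite differences. A self-contained example with a stand-in averaging Function -- hypothetical, not the repo's actual class:)

```python
import torch
from torch.autograd import gradcheck

class AvgConsensus(torch.autograd.Function):
    """Stand-in averaging consensus, for illustration only."""

    @staticmethod
    def forward(ctx, x, dim=1):
        ctx.save_for_backward(x)
        ctx.dim = dim
        return x.mean(dim=dim, keepdim=True)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output.expand(x.size()) / x.size(ctx.dim), None

# gradcheck requires double precision and requires_grad=True
x = torch.randn(2, 3, 4, dtype=torch.double, requires_grad=True)
print(gradcheck(lambda t: AvgConsensus.apply(t, 1), (x,)))  # True if backward is correct
```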