Closed: larsupb closed this issue 9 months ago
+1
I also had this exact issue and got it to import by changing all occurrences of @torch.no_grad to @torch.no_grad(). There should be 4 .py files to modify.
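For anyone landing here later, the change looks like this (a minimal sketch; the file and function names in the repo differ, `generate` below is hypothetical):

```python
import torch

# Before (fails at import time on affected torch versions): the bare class is
# applied to the function, i.e. torch.no_grad(generate) is attempted, and the
# class constructor does not accept that argument.
#
# @torch.no_grad
# def generate(model, x):
#     ...

# After: instantiate the class; the instance then acts as the decorator.
@torch.no_grad()
def generate(model, x):  # hypothetical function name, for illustration
    return model(x)
```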
Thanks! I mistook no_grad for a decorator function when in fact it is a class, so it has to be instantiated when used as a decorator. I will patch this up.
In Python, when a decorator is a plain function, like @some_decorator, you do not need parentheses. When the decorator is a class, or a factory function that returns a callable, you need parentheses even if there are no arguments to pass.
In the case of @torch.no_grad(), the parentheses are required because torch.no_grad is not a simple function: it is a class whose instances are context managers. Such an instance temporarily switches gradient calculation off when its __enter__ method runs and restores the original state in its __exit__ method.
So the parentheses instantiate the torch.no_grad class, and it is that instance that acts as the decorator. This is why you see @torch.no_grad() rather than @torch.no_grad.
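To make the mechanism concrete, here is a minimal sketch of a no_grad-style class (illustrative only, not torch's actual implementation):

```python
import torch

class disable_grad:
    """Minimal sketch of a no_grad-style class (not torch's actual code)."""

    def __enter__(self):
        # Remember the current grad mode, then switch gradients off.
        self.prev = torch.is_grad_enabled()
        torch.set_grad_enabled(False)
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        # Restore whatever grad mode was active before entering.
        torch.set_grad_enabled(self.prev)

    def __call__(self, func):
        # Runs when an *instance* decorates a function: @disable_grad()
        # first builds the instance, then this wraps func in the context.
        def wrapper(*args, **kwargs):
            with self:
                return func(*args, **kwargs)
        return wrapper

@disable_grad()  # parentheses create the instance that wraps the function
def infer(model, x):
    return model(x)

# @disable_grad (no parentheses) would instead evaluate disable_grad(infer),
# passing the function to the constructor and raising a TypeError.
```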
Fixed with the latest commit.
First, thanks for your efforts, really appreciated.
The latest changes seem to cause errors with my torch version (2.0.1). It looks like the @torch.no_grad annotation causes these errors.
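One portable workaround, if the decorator spelling keeps causing trouble across torch versions: use no_grad as a context manager inside the function body, which avoids decorator semantics entirely (a sketch; `evaluate` is a hypothetical function name):

```python
import torch

def evaluate(model, x):  # hypothetical function name
    # The context-manager form behaves the same whether or not the
    # installed torch accepts the bare @torch.no_grad spelling.
    with torch.no_grad():
        return model(x)
```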