Open jaytimbadia opened 2 years ago
I've had the same problem.
Here is another example of a custom autograd. I'm currently rewriting this function myself. https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html
Or maybe everyone encountering this problem should be using this branch instead: https://github.com/Vichoko/pytorch-wavenet — since ConstantPad1d is now built into PyTorch: https://pytorch.org/docs/stable/generated/torch.nn.ConstantPad1d.html
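Since constant padding is now built in, the custom autograd Function isn't strictly needed. A minimal sketch of replacing it with `torch.nn.functional.pad` / `torch.nn.ConstantPad1d` (the tensor shape and `pad_start` framing here are my assumptions, mirroring how the repo pads along the last, time dimension):

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 4, 10)          # (batch, channels, time) -- assumed layout
target_size = 16
num_pad = target_size - x.size(-1)

# pad_start=True in the old code corresponds to padding on the left;
# F.pad takes (left, right) amounts for the last dimension
padded_start = F.pad(x, (num_pad, 0), mode='constant', value=0)
padded_end = F.pad(x, (0, num_pad), mode='constant', value=0)

print(padded_start.shape)  # torch.Size([2, 4, 16])

# module form, equivalent to the padded_end case above
pad = torch.nn.ConstantPad1d((0, num_pad), 0)
print(pad(x).shape)        # torch.Size([2, 4, 16])
```

Because these are ordinary differentiable ops, autograd handles the backward pass automatically, with no custom `backward` needed.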
Were you able to solve it? I tried the other branch but I get the same error.
The problem we are all hitting can be solved by the following steps:

By the way, I ran the wavenet_demo successfully. My environment: torch 1.8.2+cu111, Python 3.7.
Hey,

Can you please help? I am facing a RuntimeError with this function.

```python
class ConstantPad1d(Function):
    def __init__(self, target_size, dimension=0, value=0, pad_start=False):
        super(ConstantPad1d, self).__init__()
        self.target_size = target_size
        self.dimension = dimension
        self.value = value
        self.pad_start = pad_start

    def forward(self, input):
        self.num_pad = self.target_size - input.size(self.dimension)
        assert self.num_pad >= 0, 'target size has to be greater than input size'
        self.input_size = input.size()
        size = list(input.size())
        size[self.dimension] = self.target_size
        output = input.new(*tuple(size)).fill_(self.value)
        c_output = output
        # crop output
        if self.pad_start:
            c_output = c_output.narrow(self.dimension, self.num_pad,
                                       c_output.size(self.dimension) - self.num_pad)
        else:
            c_output = c_output.narrow(self.dimension, 0,
                                       c_output.size(self.dimension) - self.num_pad)
        c_output.copy_(input)
        return output

    def backward(self, grad_output):
        grad_input = grad_output.new(*self.input_size).zero_()
        cg_output = grad_output
        # crop grad_output
        if self.pad_start:
            cg_output = cg_output.narrow(self.dimension, self.num_pad,
                                         cg_output.size(self.dimension) - self.num_pad)
        else:
            cg_output = cg_output.narrow(self.dimension, 0,
                                         cg_output.size(self.dimension) - self.num_pad)
        grad_input.copy_(cg_output)
        return grad_input
```

```
RuntimeError: Legacy autograd function with non-static forward method is deprecated.
Please use new-style autograd function with static forward method.
(Example: https://pytorch.org/docs/stable/autograd.html#torch.autograd.Function)
```

Can you please help?
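The error asks for a new-style `torch.autograd.Function` with static `forward`/`backward`. A minimal sketch of the same ConstantPad1d rewritten that way (the constructor arguments become `forward` arguments, and per-call state moves onto `ctx`; this is my rewrite, not the repo's official fix):

```python
import torch
from torch.autograd import Function

class ConstantPad1d(Function):
    @staticmethod
    def forward(ctx, input, target_size, dimension=0, value=0, pad_start=False):
        num_pad = target_size - input.size(dimension)
        assert num_pad >= 0, 'target size has to be greater than input size'
        # stash what backward needs on ctx instead of self
        ctx.num_pad, ctx.dimension, ctx.pad_start = num_pad, dimension, pad_start
        ctx.input_size = input.size()
        size = list(input.size())
        size[dimension] = target_size
        output = input.new_full(size, value)
        # copy the input into the non-padded region
        if pad_start:
            output.narrow(dimension, num_pad, input.size(dimension)).copy_(input)
        else:
            output.narrow(dimension, 0, input.size(dimension)).copy_(input)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        # crop grad_output back down to the input's region
        if ctx.pad_start:
            cg = grad_output.narrow(ctx.dimension, ctx.num_pad,
                                    ctx.input_size[ctx.dimension])
        else:
            cg = grad_output.narrow(ctx.dimension, 0,
                                    ctx.input_size[ctx.dimension])
        # one gradient per forward argument; non-tensor args get None
        return cg.contiguous(), None, None, None, None

# usage: call .apply instead of instantiating the Function
x = torch.randn(2, 3, requires_grad=True)
y = ConstantPad1d.apply(x, 5, 1, 0, False)  # pad dim 1 from 3 to 5
y.sum().backward()
print(y.shape, x.grad.shape)
```

The key differences from the legacy version: `forward`/`backward` are `@staticmethod`s, state lives on `ctx`, and callers invoke `ConstantPad1d.apply(...)` rather than constructing the Function.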
See the latest comment.
> Were you able to solve it? I tried the other branch but I got the same error.

You can refer to my comment.