[Open] lx-eotk opened this issue 4 years ago
I think it's effectively the same as Conv2d, since it has a 2-D kernel. I was also confused here, so I checked the PyTorch source code. From https://github.com/pytorch/pytorch/blob/eebd492dcfb8d4152a5ba4240486f3df5aaf112d/torch/nn/modules/conv.py#L247 to https://github.com/pytorch/pytorch/blob/eebd492dcfb8d4152a5ba4240486f3df5aaf112d/torch/nn/modules/conv.py#L77, the kernel size is packed into the weight shape, which is eventually used in https://github.com/pytorch/pytorch/blob/eebd492dcfb8d4152a5ba4240486f3df5aaf112d/torch/onnx/symbolic_opset9.py#L1068
Passing a (1, 2) kernel to Conv1d seems to be the same as passing a (1, 2) kernel to Conv2d, since they trigger the same underlying computation. (However, passing kernel_size=2 to Conv1d and to Conv2d gives different results, because Conv2d expands it to (2, 2).)
It would be great if the author could explain the particular reason for choosing Conv1d for gate_convs while choosing Conv2d for filter_convs. A small sketch of the equivalence is below.
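To make the point concrete, here is a minimal sketch (not code from the repository; the shapes and variable names are made up for illustration) assuming the Graph WaveNet input layout (batch, channels, num_nodes, time). It shows that a (1, 2) kernel applied with conv2d only convolves along the time axis, and matches a plain 1-D temporal convolution with kernel size 2 applied independently per node:

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes, assuming an input laid out as (batch, channels, nodes, time).
N, C_in, C_out, V, T = 2, 3, 5, 7, 12
x = torch.randn(N, C_in, V, T)
w = torch.randn(C_out, C_in, 1, 2)  # (1, 2) kernel: size 1 over nodes, 2 over time

# A 2-D convolution with a (1, 2) kernel mixes values only along the time axis.
y2d = F.conv2d(x, w)  # -> (N, C_out, V, T - 1)

# The same computation as a genuine 1-D convolution: fold the node axis into
# the batch dimension and keep only the temporal part of the kernel.
x1d = x.permute(0, 2, 1, 3).reshape(N * V, C_in, T)
y1d = F.conv1d(x1d, w.squeeze(2))  # weight -> (C_out, C_in, 2)
y1d = y1d.reshape(N, V, C_out, T - 1).permute(0, 2, 1, 3)

print(torch.allclose(y2d, y1d, atol=1e-5))  # True
```

So regardless of whether the module is called Conv1d or Conv2d, a (1, 2) kernel on this input layout is a per-node temporal convolution.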
I have the same confusion.
I also have the same confusion
Hello, thank you for sharing the code of Graph WaveNet.
I have some confusion about the Conv1d function used in your code. Why use Conv1d in gate_convs, and how should I understand that the kernel_size is a tuple (1, 2)? Thanks