However, I get the following error due to incorrect shapes in the matmul operation, since [2000, 2000] @ [1, 2000, 43, 128] is not allowed:
RuntimeError Traceback (most recent call last)
Input In [56], in <cell line: 1>()
----> 1 layer(features)
File ~/anaconda3/envs/torch_1.11/lib/python3.9/site-packages/torch/nn/modules/module.py:1110, in Module._call_impl(self, *input, **kwargs)
1106 # If we don't have any hooks, we want to skip the rest of the logic in
1107 # this function, and just call forward.
1108 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
1109 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1110 return forward_call(*input, **kwargs)
1111 # Do not call functions when jit is used
1112 full_backward_hooks, non_full_backward_hooks = [], []
Input In [53], in LocalGConv.forward(self, inputs)
49 support_loop_reshape = support_loop.reshape(batch_size, num_pt, num_hypo, -1)
---> 50 output = (self.adj_mat @ support_reshape) + support_loop_reshape
51 if self.bias is not None:
52 ret = output + self.bias
RuntimeError: mat1 and mat2 shapes cannot be multiplied (256000x43 and 2000x2000)
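For context on why these shapes fail: `torch.matmul` treats the last two dimensions of each operand as the matrices and broadcasts everything before them. So `[2000, 2000] @ [1, 2000, 43, 128]` tries to multiply a 2000x2000 matrix against 43x128 matrices over a [1, 2000] batch, and the inner dimensions (2000 vs. 43) don't match. A scaled-down sketch of the same mismatch (sizes shrunk for readability, variable names hypothetical):

```python
import torch

# Scaled-down stand-ins: 5 ~ 2000 nodes, 3 ~ 43 hypotheses, 4 ~ 128 features
adj = torch.randn(5, 5)            # adjacency, [num_pt, num_pt]
support = torch.randn(1, 5, 3, 4)  # [batch, num_pt, num_hypo, feat]

# matmul uses the LAST two dims as the matrix dims and broadcasts the rest,
# so this attempts (5x5) @ (3x4) over a [1, 5] batch and fails on 5 != 3.
try:
    adj @ support
except RuntimeError as e:
    print("matmul failed:", e)
```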
First of all, thank you so much for writing this all in PyTorch; I was about to do the same thing myself and you saved me a lot of trouble. I'm running into this error in the LocalGConv layer and hoping you can help. Please let me know if you have any idea what the fix is.
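For reference, one way to make the contraction line up, assuming `adj_mat` is meant to mix information across the `num_pt` (node) axis, is to contract that axis explicitly with `einsum`, or to `permute` it into the second-to-last position before the `@`. A sketch with shapes scaled down (5 ~ 2000, 3 ~ 43, 4 ~ 128); this is a guess at the intended semantics, not the repo author's confirmed fix:

```python
import torch

batch_size, num_pt, num_hypo, feat = 1, 5, 3, 4  # stand-ins for 1, 2000, 43, 128
adj_mat = torch.randn(num_pt, num_pt)
support_reshape = torch.randn(batch_size, num_pt, num_hypo, feat)

# Contract adj_mat against the node axis (dim 1) of support_reshape:
# output[b, i, h, f] = sum_j adj_mat[i, j] * support_reshape[b, j, h, f]
output = torch.einsum('ij,bjhf->bihf', adj_mat, support_reshape)
print(output.shape)  # torch.Size([1, 5, 3, 4])

# Equivalent via permute + matmul: move num_pt into the matrix position,
# multiply, then move it back.
out2 = (adj_mat @ support_reshape.permute(0, 2, 1, 3)).permute(0, 2, 1, 3)
assert torch.allclose(output, out2, atol=1e-5)
```

With the real sizes this yields a [1, 2000, 43, 128] result, which matches what `support_loop_reshape` would need for the addition on the next line of the traceback.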