niais / mv-ignet

The Official PyTorch implementation of "Learning Multi-View Interactional Skeleton Graph for Action Recognition" in TPAMI 2020
BSD 2-Clause "Simplified" License

Einsum can be replaced by conv? #3

Open southsea0725 opened 3 years ago

southsea0725 commented 3 years ago

depth-wise conv

x = torch.einsum('nctv,cvw->nctw', (x, dw_gcn_weight))

point-wise conv

x = torch.einsum('nctw,cd->ndtw', (x, self.pw_gcn_weight))

Can these be replaced by conv with groups? I ask because I want to test speed on a phone, so I need to use ncnn / mnn / tnn. That means converting the model from .pth to ONNX and then to ncnn / mnn / tnn, but einsum is NOT supported by these frameworks.

niais commented 3 years ago


torch.einsum is just a compact way to express these matrix multiplications, so you can replace it with matmul plus tensor reshaping and permutation. For 'nctv,cvw->nctw': for each channel c, first reshape the two tensors into sizes [nt, v] and [v, w], then perform a matmul between them to get [nt, w]. Stacking over channels gives [c, nt, w], which you reshape and permute back to nctw.
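
A minimal sketch of that replacement (my own code, not from the repo), assuming the shapes N, C, T, V, W, D and the weight names from the snippets above. It checks the matmul rewrite of the depth-wise einsum and, for the point-wise einsum, an equivalent 1x1 Conv2d, since MatMul and Conv are standard ONNX ops:

import torch
import torch.nn.functional as F

# hypothetical example shapes; V == W for a square adjacency-like weight
N, C, T, V, W, D = 2, 8, 16, 25, 25, 32
x = torch.randn(N, C, T, V)
dw_gcn_weight = torch.randn(C, V, W)   # per-channel (depth-wise) graph weight
pw_gcn_weight = torch.randn(C, D)      # point-wise channel-mixing weight

# 1) depth-wise: 'nctv,cvw->nctw' as a per-channel matmul
ref_dw = torch.einsum('nctv,cvw->nctw', (x, dw_gcn_weight))
y = x.permute(1, 0, 2, 3).reshape(C, N * T, V)   # [c, nt, v]
y = torch.matmul(y, dw_gcn_weight)               # [c, nt, v] @ [c, v, w] -> [c, nt, w]
y = y.reshape(C, N, T, W).permute(1, 0, 2, 3)    # back to [n, c, t, w]
print(torch.allclose(ref_dw, y, atol=1e-5))      # True

# 2) point-wise: 'nctw,cd->ndtw' is a 1x1 convolution over the channel dim
ref_pw = torch.einsum('nctw,cd->ndtw', (y, pw_gcn_weight))
conv_weight = pw_gcn_weight.t().reshape(D, C, 1, 1)   # Conv2d expects [out, in, 1, 1]
z = F.conv2d(y, conv_weight)                          # [n, d, t, w]
print(torch.allclose(ref_pw, z, atol=1e-5))           # True

With these rewrites the forward pass contains only reshape/permute, matmul, and conv2d, which should survive the .pth -> ONNX export; whether a given ncnn / mnn / tnn version accepts the resulting MatMul node still needs to be verified on your side.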