Hi!
Can you explain why GAST-Net's feature dimension is (25, 17, 256) (T, N, C) after the first Graph Attention Block in the paper?
I think an input of shape (256, 27, 17) (C, T, N) becomes (256, 9, 17) after going through the Conv2D with kernel (3, 1) and stride (3, 1) in gast_net.py.
When I print the residual shape inside the network, it shows (256, 9, 17) # (C, T, N), not the (25, 17, 256) reported in the paper.
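For reference, here is a minimal PyTorch shape check of what I mean (the 256 channels and the dummy batch size are just placeholders, not the actual gast_net.py layer):

```python
import torch
import torch.nn as nn

# Strided temporal conv as in gast_net.py: kernel (3, 1), stride (3, 1).
# Along T: floor((27 - 3) / 3) + 1 = 9, so T goes 27 -> 9.
conv_strided = nn.Conv2d(256, 256, kernel_size=(3, 1), stride=(3, 1))
x = torch.randn(1, 256, 27, 17)        # (batch, C, T, N)
print(conv_strided(x).shape)           # torch.Size([1, 256, 9, 17])

# An unstrided kernel-3 conv would instead give 27 - (3 - 1) = 25 along T,
# which seems to match the (25, 17, 256) figure in the paper.
conv_unstrided = nn.Conv2d(256, 256, kernel_size=(3, 1), stride=(1, 1))
print(conv_unstrided(x).shape)         # torch.Size([1, 256, 25, 17])
```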
Thanks