Thanks for sharing your code! I am currently encountering some issues.
I tried calling the function `full_attention_conv`:
```python
x = torch.randn(25, 4, 16)
a = full_attention_conv(x, x, x, kernel='simple', output_attn=True)
```
It reports an error: `RuntimeError: The size of tensor a (25) must match the size of tensor b (4) at non-singleton dimension 1`
Hi, the reason could be that you set num_heads=1 while your input x suggests the head number should be 4.
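
For reference, below is a rough sketch of a "simple"-kernel (linear) attention over inputs laid out as `[num_nodes, num_heads, head_dim]`. It is not the repository's `full_attention_conv` implementation, and the exact kernel form (1 + q·k with normalized q and k) is an assumption; it is only meant to illustrate the layout the function appears to expect and why the head dimension of all three inputs (and any configured head count) has to agree.

```python
import torch
import torch.nn.functional as F

def simple_kernel_attention(qs, ks, vs):
    """Sketch of a linear ('simple'-kernel) attention, NOT the repo's code.

    qs, ks, vs: [num_nodes, num_heads, head_dim]. All three must share the
    same num_heads, otherwise the einsums/broadcasts below raise size-mismatch
    errors similar to the one reported in this issue.
    """
    # unit-normalize queries and keys so the assumed kernel 1 + q.k is non-negative
    qs = F.normalize(qs, dim=-1)
    ks = F.normalize(ks, dim=-1)
    N = ks.shape[0]

    # numerator: sum_l (1 + q_n . k_l) v_l, computed in linear time
    kvs = torch.einsum("lhm,lhd->hmd", ks, vs)               # [H, M, D]
    num = torch.einsum("nhm,hmd->nhd", qs, kvs) + vs.sum(0)  # [N, H, D]

    # denominator: sum_l (1 + q_n . k_l) = N + q_n . sum_l k_l
    denom = N + torch.einsum("nhm,hm->nh", qs, ks.sum(0))    # [N, H]
    return num / denom.unsqueeze(-1)

x = torch.randn(25, 4, 16)   # 25 nodes, 4 heads, 16 dims per head
out = simple_kernel_attention(x, x, x)
print(out.shape)             # torch.Size([25, 4, 16])
```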