kyegomez / VisionMamba

Implementation of Vision Mamba from the paper "Vision Mamba: Efficient Visual Representation Learning with Bidirectional State Space Model". It is 2.8x faster than DeiT and saves 86.8% GPU memory when performing batch inference to extract features on high-resolution images.

Matrix dimensions do not match #24

Open yanyu369 opened 1 week ago

yanyu369 commented 1 week ago

As you can see below, I am currently facing a problem. I modified the Transformer part of my code using the code you provided, but it reports that the dimensions of my matrix multiplication do not match: there is a dimension mismatch when x1_ssm and x2_ssm are multiplied with z. I am also still confused about one more thing: doesn't Vim say that the Mamba blocks come in two directions, forward and backward? Why don't I see this in the code you provided?

    def forward_temporal(self, x, F):
        B, J, C = x.shape

        # Skip connection
        skip = x      

        # Normalization
        x = self.norm(x)

        # Project x into the value path (x1) and the gate path (z1) with linears
        x1 = self.proj_x(x)
        z1 = self.proj_z(x)

        # Forward direction: transpose (not reshape) to (B, C, J) for the Conv1d
        x1_rearranged = self.softplus(x1.transpose(1, 2))
        forward_conv_output = self.forward_conv1d(x1_rearranged)
        forward_conv_output = forward_conv_output.transpose(1, 2)  # back to (B, J, C)
        x1_ssm = self.forward_ssm(forward_conv_output)

        # Backward direction: same input order, separate conv and SSM
        x2_rearranged = self.softplus(x1.transpose(1, 2))
        backward_conv_output = self.backward_conv1d(x2_rearranged)
        backward_conv_output = backward_conv_output.transpose(1, 2)  # back to (B, J, C)
        x2_ssm = self.backward_ssm(backward_conv_output)

        # Activation on the gate
        z = self.activation(z1)

        # Gate the backward SSM output with z (elementwise multiply, not matmul)
        x2 = x2_ssm * z

        # Gate the forward SSM output with z (elementwise multiply)
        x1 = x1_ssm * z

        # Sum both directions
        x = x1 + x2

        # Add skip connection
        return x + skip
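
For reference, here is a minimal, self-contained sketch of a bidirectional block in which the gating shapes line up by construction. It is not the repo's implementation: the Identity stand-ins for the SSMs, the kernel_size=1 convolutions, and the C -> C projections are all assumptions. It illustrates two things: transposing (rather than reshaping) the token axis before the Conv1d, and obtaining the backward direction by flipping the token axis, which is how Vim-style bidirectionality is usually described.

    import torch
    import torch.nn as nn

    class BidirectionalSSMBlockSketch(nn.Module):
        """Hypothetical sketch: a shared gate z modulates forward and backward paths.

        Every sub-module maps (B, J, C) -> (B, J, C), so the elementwise
        products x1_ssm * z and x2_ssm * z are shape-consistent by construction.
        """

        def __init__(self, dim: int):
            super().__init__()
            self.norm = nn.LayerNorm(dim)
            self.proj_x = nn.Linear(dim, dim)  # value path
            self.proj_z = nn.Linear(dim, dim)  # gate path
            self.forward_conv1d = nn.Conv1d(dim, dim, kernel_size=1)
            self.backward_conv1d = nn.Conv1d(dim, dim, kernel_size=1)
            # shape-preserving stand-ins for the forward / backward SSMs
            self.forward_ssm = nn.Identity()
            self.backward_ssm = nn.Identity()
            self.activation = nn.SiLU()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            B, J, C = x.shape
            skip = x
            x = self.norm(x)

            x1 = self.proj_x(x)                   # (B, J, C) value path
            z = self.activation(self.proj_z(x))   # (B, J, C) gate

            # forward direction: transpose to (B, C, J) for Conv1d, then back
            f = self.forward_conv1d(x1.transpose(1, 2)).transpose(1, 2)
            f = self.forward_ssm(f)               # (B, J, C)

            # backward direction: flip the token axis, process, flip back
            b = torch.flip(x1, dims=[1])
            b = self.backward_conv1d(b.transpose(1, 2)).transpose(1, 2)
            b = torch.flip(self.backward_ssm(b), dims=[1])  # (B, J, C)

            # elementwise gating (not matmul): every operand is (B, J, C)
            return f * z + b * z + skip

    # quick shape check
    block = BidirectionalSSMBlockSketch(dim=64)
    out = block(torch.randn(2, 16, 64))
    print(out.shape)  # torch.Size([2, 16, 64])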
github-actions[bot] commented 1 week ago

Hello there, thank you for opening an Issue ! 🙏🏻 The team was notified and they will get back to you asap.