szc19990412 / TransMIL

TransMIL: Transformer based Correlated Multiple Instance Learning for Whole Slide Image Classification

Error in NyStromAttention #42

Closed bryanwong17 closed 10 months ago

bryanwong17 commented 10 months ago
import torch.nn as nn
from nystrom_attention import NystromAttention

class TransLayer(nn.Module):

    def __init__(self, norm_layer=nn.LayerNorm, dim=512):
        super().__init__()
        self.norm = norm_layer(dim)
        self.attn = NystromAttention(
            dim=dim,
            dim_head=dim // 8,
            heads=8,
            num_landmarks=dim // 2,   # number of landmarks
            pinv_iterations=6,        # number of Moore-Penrose iterations for approximating the pseudoinverse; 6 was recommended by the paper
            residual=True,            # whether to do an extra residual with the value; supposedly faster convergence if turned on
            dropout=0.1
        )

    def forward(self, x):
        attn_out = self.attn(self.norm(x))  # compute attention once instead of twice
        print(attn_out.shape)
        x = x + attn_out
        return x

RuntimeError: Output 0 of PermuteBackward0 is a view and is being modified inplace. This view is the output of a function that returns multiple views. Such functions do not allow the output views to be modified inplace. You should replace the inplace operation by an out-of-place one.

Hi, I was wondering if you have ever encountered this issue. If so, how did you solve it? Thank you!

dasun07 commented 10 months ago

Hi, I am facing the same issue. Did you manage to solve it? Thank you in advance.

bryanwong17 commented 10 months ago

Hi, I managed to solve the problem by manually changing this line of code inside the nystrom-attention package:

q *= self.scale

to

q = q * self.scale
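For anyone curious why this fixes the error: `q *= self.scale` writes in place through a tensor that is a view (here, the output of a permute), which autograd forbids, while `q = q * self.scale` allocates a new tensor and leaves the view's base untouched. A minimal sketch of the same view semantics, using NumPy to stand in for PyTorch tensors (the autograd check itself is PyTorch-specific):

```python
import numpy as np

# A view shares memory with its base array, analogous to the tensor
# returned by permute() in PyTorch.
base = np.arange(6, dtype=float)
q = base.reshape(2, 3)[0]   # q is a view into `base`

q2 = q * 2.0                # out-of-place: allocates a new array
assert base[1] == 1.0       # `base` is untouched

q *= 2.0                    # in-place: writes through the view
assert base[1] == 2.0       # `base` was modified
```

PyTorch raises the `PermuteBackward0` RuntimeError precisely to prevent that second, silent modification of the base tensor during backpropagation.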