zhihou7 / BatchFormer

CVPR2022, BatchFormer: Learning to Explore Sample Relationships for Robust Representation Learning, https://arxiv.org/abs/2203.01522

Batchformer application issues #28

Closed thisisqiaoqiao closed 9 months ago

thisisqiaoqiao commented 10 months ago

Hello, your work is very useful. After reading the paper, can I think of BatchFormer as a module that lets the samples within each batch learn from one another during training? If so, does that mean the following code can be inserted into any learning task as a module to learn the relationships between samples?

def BatchFormer(x, y, encoder, is_training):
    # x: input features with the shape [N, C]
    # y: labels with the shape [N]
    # encoder: TransformerEncoderLayer(C, 4, C, 0.5)
    if not is_training:
        return x, y
    pre_x = x
    x = encoder(x.unsqueeze(1)).squeeze(1)
    x = torch.cat([pre_x, x], dim=0)
    y = torch.cat([y, y], dim=0)
    return x, y
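To make the question concrete, here is a minimal, self-contained sketch of how the snippet might be dropped into a generic classification step. The backbone, classifier, feature sizes, and toy tensors below are hypothetical stand-ins, not part of the original code; the `nn.TransformerEncoderLayer` is constructed with the arguments mentioned in the comment, and the batch dimension is moved to the sequence axis so self-attention runs across the samples of the mini-batch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def batch_former(x, y, encoder, is_training):
    # x: [N, C] features; y: [N] labels
    if not is_training:
        return x, y
    pre_x = x
    # [N, C] -> [N, 1, C]: the batch dim becomes the sequence dim,
    # so attention is computed between samples in the mini-batch
    x = encoder(x.unsqueeze(1)).squeeze(1)
    x = torch.cat([pre_x, x], dim=0)  # keep both the plain and attended branches
    y = torch.cat([y, y], dim=0)      # duplicate labels to match the doubled batch
    return x, y

# Hypothetical toy setup showing where the module plugs in:
C = 64
backbone = nn.Linear(32, C)                          # stand-in for any feature extractor
encoder = nn.TransformerEncoderLayer(C, 4, C, 0.5)   # d_model, nhead, dim_feedforward, dropout
classifier = nn.Linear(C, 10)                        # head shared by both branches

images = torch.randn(8, 32)
labels = torch.randint(0, 10, (8,))

feats = backbone(images)
feats, labels = batch_former(feats, labels, encoder, is_training=True)
logits = classifier(feats)                           # [2N, 10]
loss = F.cross_entropy(logits, labels)
loss.backward()
```

At inference time `is_training=False`, so the module is a no-op and the batch-attention branch adds no cost, which is what makes it a plug-in for essentially any per-sample learning task.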
zhihou7 commented 9 months ago

Yes, that's our target.

thisisqiaoqiao commented 9 months ago

Thank you for your prompt reply. I have another question, opened as #29.