Would you be able to write an example with dummy data for the OutlookAttention module's forward pass? I am trying to practice, and understand, each step that is taken, but I cannot reproduce the forward pass from the code in models/volo.py. Any help would be much appreciated.
Something like this but for outlook attention instead of a transformer.
tm = nn.Transformer(nhead=16, num_encoder_layers=12)
src = torch.rand((10, 32, 512))
tgt = torch.rand((20, 32, 512))
tm(src, tgt)
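For reference, here is a minimal sketch of an outlook attention forward pass with dummy data. This is not the exact code from models/volo.py (dropout, bias flags, and other options are omitted), just a stripped-down reimplementation following the VOLO paper's description, so the layer sizes and defaults here (dim=64, num_heads=2, kernel_size=3) are illustrative assumptions:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class OutlookAttention(nn.Module):
    """Simplified outlook attention: attention weights are generated
    directly from each pooled token via a linear layer, then applied to
    the unfolded local value windows (no query-key dot product)."""
    def __init__(self, dim, num_heads, kernel_size=3, padding=1, stride=1):
        super().__init__()
        head_dim = dim // num_heads
        self.num_heads = num_heads
        self.kernel_size = kernel_size
        self.padding = padding
        self.stride = stride
        self.scale = head_dim ** -0.5

        self.v = nn.Linear(dim, dim, bias=False)
        # one k*k x k*k attention map per head, per spatial location
        self.attn = nn.Linear(dim, kernel_size**4 * num_heads)
        self.proj = nn.Linear(dim, dim)

        self.unfold = nn.Unfold(kernel_size, padding=padding, stride=stride)
        self.pool = nn.AvgPool2d(stride, stride, ceil_mode=True)

    def forward(self, x):  # x: (B, H, W, C), channels-last as in VOLO
        B, H, W, C = x.shape
        # values: extract k*k local windows around each position
        v = self.v(x).permute(0, 3, 1, 2)                       # (B, C, H, W)
        h, w = math.ceil(H / self.stride), math.ceil(W / self.stride)
        v = self.unfold(v).reshape(
            B, self.num_heads, C // self.num_heads,
            self.kernel_size**2, h * w
        ).permute(0, 1, 4, 3, 2)          # (B, heads, h*w, k*k, head_dim)

        # attention: predicted per-location from the (pooled) token itself
        attn = self.pool(x.permute(0, 3, 1, 2)).permute(0, 2, 3, 1)
        attn = self.attn(attn).reshape(
            B, h * w, self.num_heads,
            self.kernel_size**2, self.kernel_size**2
        ).permute(0, 2, 1, 3, 4)          # (B, heads, h*w, k*k, k*k)
        attn = (attn * self.scale).softmax(dim=-1)

        # weight the local windows and fold the overlaps back together
        out = (attn @ v).permute(0, 1, 4, 3, 2).reshape(
            B, C * self.kernel_size**2, h * w)
        out = F.fold(out, output_size=(H, W), kernel_size=self.kernel_size,
                     padding=self.padding, stride=self.stride)
        return self.proj(out.permute(0, 2, 3, 1))               # (B, H, W, C)

# dummy data: note the channels-last layout, unlike nn.Transformer's input
oa = OutlookAttention(dim=64, num_heads=2)
x = torch.rand(2, 8, 8, 64)   # (batch, height, width, channels)
out = oa(x)                   # same shape as the input: (2, 8, 8, 64)
```

The shape to watch is the input layout: if I'm reading the repo correctly, OutlookAttention expects (B, H, W, C) rather than the (seq, batch, dim) layout that nn.Transformer uses, so the dummy tensor above is a small feature map, not a token sequence.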