Closed: lf-demo closed this issue 1 year ago
I have noticed that the expansion of pos_emb in the Transformer class does not match the shape of the Moving MNIST tensors. Should it be expanded to (src.size(0), 20, self.d_model, 16, 16), since your patch_size is 2 instead of 4?
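For reference, here is a minimal sketch of how the spatial grid of the positional embedding follows from the patch size, assuming 64x64 Moving MNIST frames, 20 frames per sequence, and a patch embedding whose stride equals the patch size. The names (src, pos_emb, d_model, patch_size) mirror the question and are not taken from the repository's actual code.

```python
import torch

# Hypothetical shapes for illustration only; not the repo's implementation.
batch, seq_len, d_model = 8, 20, 128   # 20 Moving MNIST frames per sequence
frame_size = 64                        # Moving MNIST frames are 64x64

for patch_size in (2, 4):
    grid = frame_size // patch_size    # spatial grid after patch embedding
    # one embedding per (frame, channel, grid cell), shared across the batch
    pos_emb = torch.zeros(1, seq_len, d_model, grid, grid)
    src = torch.zeros(batch, seq_len, d_model, grid, grid)
    # expand broadcasts the embedding over the batch dimension
    expanded = pos_emb.expand(src.size(0), seq_len, d_model, grid, grid)
    print(patch_size, expanded.shape)

# patch_size=2 -> torch.Size([8, 20, 128, 32, 32])
# patch_size=4 -> torch.Size([8, 20, 128, 16, 16])
```

Under these assumptions, a 16x16 grid corresponds to patch_size 4, while patch_size 2 would give a 32x32 grid.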
Sorry for the question. I have since noticed the tensor size changes in the encoder and decoder.