I made the following modifications in 'configs/simmim/simmim_pretrain__swin_base__img192_window6__800ep.yaml' and trained a model with 2.6 billion parameters, which is about 400 million fewer than the 3 billion claimed for Swin-V2-giant. How many parameters does the official Swin-V2-giant actually have? Thanks!
```yaml
EMBED_DIM: 512
DEPTHS: [ 2, 2, 42, 2 ]
NUM_HEADS: [ 16, 32, 64, 128 ]
```
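For reference, the 2.6 billion figure comes from counting trainable parameters of the built model. Below is a minimal PyTorch sketch of that check; the commented-out `get_config` / `build_model` calls are assumptions about this repo's helper names, not a confirmed API, so adjust them to the actual entry points.

```python
import torch.nn as nn


def count_parameters(model: nn.Module) -> int:
    """Sum of all trainable parameter tensors in a model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)


# Sanity check on a toy module: a single Linear(512 -> 512) layer has
# 512*512 weights + 512 biases = 262,656 parameters.
print(count_parameters(nn.Linear(512, 512)))  # 262656

# Applying it to the model built from the modified YAML (helper names
# assumed; see the repo's config / model-building code for the real ones):
# from config import get_config
# from models import build_model
# model = build_model(get_config(args))
# print(f"{count_parameters(model) / 1e9:.2f}B parameters")
```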