NVlabs / SegFormer

Official PyTorch implementation of SegFormer
https://arxiv.org/abs/2105.15203

Clarifying the Pretraining Procedure #120

Closed · gauenk closed this 1 year ago

gauenk commented 1 year ago

The paper states "We pre-train the encoder on the Imagenet-1K dataset".

Does this mean the encoder is first trained on a classification task? If so, is there code for this that you could share? I cannot find it in the repo.

Primarily, I want to be able to reproduce the "mit_*.pth" files, either conceptually or with your code.

Thank you!
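For concreteness, here is roughly what I assume the pre-training looks like: a standard supervised ImageNet-1K classification setup on the MiT-B0 backbone from mmseg/models/backbones/mix_transformer.py, with a pooled classification head on top. This is only my own sketch with illustrative hyperparameters, not your code, so please correct me if the actual recipe differs.

```python
# Unofficial sketch of ImageNet-1K classification pre-training for the MiT encoder.
# The backbone import follows this repo's layout; the training loop and
# hyperparameters below are generic guesses, not the authors' exact recipe.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

from mmseg.models.backbones.mix_transformer import mit_b0  # MiT-B0 encoder


class MiTClassifier(nn.Module):
    """Wrap the MiT encoder with global average pooling and a linear head."""

    def __init__(self, num_classes=1000):
        super().__init__()
        self.backbone = mit_b0()
        self.head = nn.Linear(256, num_classes)  # 256 = MiT-B0 last-stage width

    def forward(self, x):
        feats = self.backbone(x)          # list of 4 multi-scale feature maps
        pooled = feats[-1].mean(dim=(2, 3))  # pool the last stage spatially
        return self.head(pooled)


def pretrain(data_root, epochs=300, batch_size=128, lr=1e-3):
    # Plain ImageNet-style augmentation; the paper's exact pipeline may differ.
    tfm = transforms.Compose([
        transforms.RandomResizedCrop(224),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])
    loader = DataLoader(datasets.ImageFolder(data_root, tfm),
                        batch_size=batch_size, shuffle=True, num_workers=8)

    model = MiTClassifier().cuda()
    opt = torch.optim.AdamW(model.parameters(), lr=lr, weight_decay=0.05)
    criterion = nn.CrossEntropyLoss()

    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.cuda(), labels.cuda()
            loss = criterion(model(images), labels)
            opt.zero_grad()
            loss.backward()
            opt.step()

    # Keep only the encoder weights, analogous to the released mit_*.pth files.
    torch.save(model.backbone.state_dict(), "mit_b0_pretrained.pth")
```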

gauenk commented 1 year ago

#116