Closed Gregory1994 closed 1 year ago
Hi, thanks for your great work!
I'm interested in the outstanding segmentation result of TinyMIM-Tiny and tried to reproduce it, but I only got 38.87 mIoU on the ADE20K dataset, considerably lower than the 45.0 mIoU reported in your paper.
I ran my experiment with mmsegmentation and used TinyMIM-FT-Tstar.pth as the pretrained model.
So would you please tell me:
Best wishes!

There are slight differences between TinyMIM and TinyMIM-T*, including the distillation tokens, the number of heads, and an extra fully connected layer. The code will be released soon. Please email me if you need the log and the code I used for reference; they are not yet ready to be published on GitHub.

Thanks for your reply! I've emailed you and am looking forward to your code and log. I will close this issue.