Res2Net / Res2Net-PretrainedModels

(ImageNet pretrained models) The official PyTorch implementation of the TPAMI paper "Res2Net: A New Multi-scale Backbone Architecture"
https://mmcheng.net/res2net/

About the pretrained model #23

Closed: XiaoHao-Chen closed this issue 4 years ago

XiaoHao-Chen commented 4 years ago

Hello, I used the Res2Net you provide for my own classification task, but I noticed a strange phenomenon when using your pretrained model parameters. At the beginning of training the loss is large and the classification accuracy is very low, around 2%. Until now I had only seen this on networks that do not use ImageNet pretraining. I also tried setting pretrained = False, and the result is the same as with pretrained = True. My training has not finished yet, but judging from the current trend of the accuracy curve, the potential seems limited. (I will report the final result once training is done.) So I'd like to ask: did the same phenomenon occur in your own classification tasks? Thank you.
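For reference, a minimal sketch of the kind of setup described above, assuming the `res2net50` constructor exposed by this repository's `res2net` module (as shown in the README); the class count and variable names are placeholders, not the issue author's actual code:

```python
import torch.nn as nn
from res2net import res2net50  # model constructor provided by this repository

num_classes = 45  # placeholder: e.g. number of remote-sensing scene classes

model = res2net50(pretrained=True)                       # load ImageNet weights
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new, randomly initialised head

# Only the replaced head is random; the backbone keeps the pretrained filters,
# so the initial loss should normally be lower than with pretrained=False.
```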

gasvn commented 4 years ago

If the data in your task is similar to the data in ImageNet (usually natural images), then pretraining will help a lot. Otherwise, pretraining will be of limited help. If the loss is too large, there may be something wrong in your code (learning rate, pretrained-weight loading, batch size, ...).
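One quick way to rule out the "pretrained-weight loading" point is to inspect what `load_state_dict` reports. A hedged sketch, assuming the weights are loaded from a local checkpoint file (the path is a placeholder):

```python
import torch
from res2net import res2net50

model = res2net50(pretrained=False)
state = torch.load('res2net50_checkpoint.pth', map_location='cpu')  # placeholder path

missing, unexpected = model.load_state_dict(state, strict=False)
print('missing keys   :', missing)      # ideally empty, or only a replaced fc head
print('unexpected keys:', unexpected)   # non-empty often means a 'module.' prefix
                                        # left over from a DataParallel checkpoint
```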

XiaoHao-Chen commented 4 years ago

My dataset is different from ImageNet; it is remote sensing data. I was using ResNet-50 before. When I did not use the ResNet-50 pretrained by PyTorch, I saw the same problem of large loss and low accuracy; when I used the PyTorch pretrained network, the loss was lower and the classification accuracy was better. The other settings did not change between the two attempts and are similar to the settings commonly used in the remote sensing field. I'll keep looking for problems in these settings based on what you said. Thank you very much for your reply.

gasvn commented 4 years ago

In my experience, if the loss doesn't drop and instead grows larger, you can try lowering the initial learning rate.
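A sketch of one common way to apply this advice, using parameter groups so the pretrained backbone gets a smaller learning rate than the freshly initialised head; the values and the `res2net50` import are illustrative assumptions, not settings recommended by the authors:

```python
import torch.nn as nn
from torch import optim
from res2net import res2net50

model = res2net50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, 45)  # placeholder class count

head_params = list(model.fc.parameters())
head_ids = {id(p) for p in head_params}
backbone_params = [p for p in model.parameters() if id(p) not in head_ids]

optimizer = optim.SGD(
    [{'params': backbone_params, 'lr': 1e-3},   # gentle lr for pretrained layers
     {'params': head_params,     'lr': 1e-2}],  # larger lr for the new head
    momentum=0.9, weight_decay=1e-4)
```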