Res2Net / Res2Net-PretrainedModels

(ImageNet pretrained models) The official PyTorch implementation of the TPAMI paper "Res2Net: A New Multi-scale Backbone Architecture"
https://mmcheng.net/res2net/

What is baseWidth? #64

Closed udkii closed 2 years ago

udkii commented 2 years ago

Hello. I have a question about Res2Net.

I don't understand what 'baseWidth' is. When the Bottle2neck and Res2Net classes are defined, I can see baseWidth=26 in the definition, and baseWidth is described as the basic width of the conv3*3.

What does the basic width of a convolution mean? Isn't the size of a 3*3 convolution layer just 3*3?

I wonder where 26 comes from. Looking at the size of bn1.bias in Res2Net, the output is torch.Size([104]). This 104 is the value of baseWidth (26) * scale (4). What exactly is this baseWidth?

There doesn't seem to be any layer of size 104 in the bottleneck of ResNet-50. I need some advice. Can anybody help me?

Thank you.
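(Editor's note: the 104 can be reproduced from the width computation used in the repo's Bottle2neck. The snippet below is a minimal sketch assuming the formula width = floor(planes * baseWidth / 64), with planes=64 in the first stage; it is illustrative, not a copy of the source.)

```python
import math

# Sketch of the assumed Bottle2neck width computation: the first 1x1 conv
# expands its input to width*scale channels, and bn1 normalizes those channels.
baseWidth, scale = 26, 4
planes = 64                                           # first stage of Res2Net-50

width = int(math.floor(planes * (baseWidth / 64.0)))  # 64 * 26/64 = 26
print(width)          # 26
print(width * scale)  # 104 -> matches the torch.Size([104]) seen in bn1.bias
```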

gasvn commented 2 years ago

baseWidth is a multiplier used to make sure the number of parameters of Res2Net is similar to that of ResNet.
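(Editor's note: a rough back-of-the-envelope check of that statement, assuming a ResNet-50-style bottleneck with inplanes=256 and planes=64 and counting only the three conv layers, no BN or bias; the exact shapes are an assumption for illustration, not taken from the repo.)

```python
# Why baseWidth=26: per-block conv parameter counts end up close to ResNet's.
inplanes, planes = 256, 64

# ResNet bottleneck: 256 -> 64 (1x1), 64 -> 64 (3x3), 64 -> 256 (1x1)
resnet = inplanes*planes + planes*planes*9 + planes*inplanes
print(resnet)    # 69632

# Res2Net bottleneck (scale=4, width=26): 256 -> 104 (1x1),
# three 3x3 convs of 26 -> 26 channels, then 104 -> 256 (1x1)
width, scale = 26, 4
res2net = inplanes*width*scale + (scale - 1)*width*width*9 + width*scale*inplanes
print(res2net)   # 71500 -> roughly the same budget as the ResNet block
```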

Dream-ai commented 2 years ago

I also had the same problem

Liqq1 commented 2 years ago

baseWidth is a multiplier used to make sure the number of parameters of Res2Net is similar to that of ResNet.

Hello, I'd like to ask: what is the purpose of the division by 64 applied to planes? Why not just use 26*scale directly?

It is only a convenient way of matching ResNet's parameter count. Any choice that aligns the parameter count, including the one you describe, would also work.
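(Editor's note: to make the /64 point concrete, width is derived from planes, so it grows with each stage; a fixed 26*scale would only match the first stage. A small sketch, assuming the standard ResNet-50 per-stage planes values for illustration.)

```python
import math

# Assumed width computation, evaluated for each stage of a ResNet-50-style network.
baseWidth, scale = 26, 4
for planes in (64, 128, 256, 512):   # per-stage planes in ResNet-50
    width = int(math.floor(planes * (baseWidth / 64.0)))
    print(planes, width, width * scale)
# 64   26  104
# 128  52  208
# 256 104  416
# 512 208  832
```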