GANPerf / SAM


What model was used as the regular ResNet-50, referred to in the paper as "Fine-Tuning"? #2

Open demidovd98 opened 1 year ago

demidovd98 commented 1 year ago

Hello, thanks for the code.

For some reason, I can't reproduce the accuracy for the regular ResNet-50 (without SAM). With 10% of the data I'm only getting about 32-33% (while the paper reports 37%), but for 15/30/50/100% the results are pretty close to the paper.

Could you please specify which model exactly was used as the regular ResNet-50 (referred to in the paper as "Fine-Tuning")? Was it a conventional full ResNet-50 that was simply fine-tuned, or was the classifier trained on top of the features from the 4th conv layer (projected onto 2048-d)?

Thank you.

GANPerf commented 1 year ago

Hi guys, thank you for your interest. "Fine-tuning" means training the whole network, including the ResNet-50 backbone and the classifier. If with 10% of the data you are only getting about 32-33%, you may need to check whether you are freezing the backbone and training only the classifier, which will result in lower accuracy. Hope this is helpful to you.
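
For reference, here is a minimal PyTorch sketch of the distinction being described (this is not the repository's training script; the optimizer, learning rate, and class count are placeholder assumptions):

```python
import torch
import torch.nn as nn
from torchvision import models

num_classes = 200  # placeholder; set to your dataset's class count

# "Fine-Tuning" baseline: train the whole network, backbone + classifier.
model = models.resnet50(pretrained=True)
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new 2048-d -> num_classes head
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)  # all parameters update

# What NOT to do for this baseline: freezing the backbone (a linear probe)
# trains only the classifier and gives noticeably lower accuracy with 10% data.
frozen = models.resnet50(pretrained=True)
for p in frozen.parameters():
    p.requires_grad = False
frozen.fc = nn.Linear(frozen.fc.in_features, num_classes)  # only this layer is trainable
probe_optimizer = torch.optim.SGD(frozen.fc.parameters(), lr=1e-3, momentum=0.9)
```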