SDS-Lab / ROT-Pooling

Learnable Global Pooling Layers Based on Regularized Optimal Transport (ROT)
MIT License

resnet+rotp #2

Open dyqii opened 4 weeks ago

dyqii commented 4 weeks ago

resnet+rotp: I tested on my local palmprint database. Using your ResNet code, I instantiated ResNet18 directly without replacing any layers, i.e., I commented out the lines `dim = 512` and `model = replace_pooling(model, k=args.k, dim=dim, f_method=args.f_method)` in the `main_worker` function. The results were:

    * Acc@1 96.412 Acc@5 99.265
    Epoch: [99][ 0/22]  Time 1.013 ( 1.013)  Data 0.955 ( 0.955)  Loss 3.2381e-01 (3.2381e-01)  Acc@1 91.41 ( 91.41)  Acc@5 97.27 ( 97.27)
    Epoch: [99][10/22]  Time 0.107 ( 0.238)  Data 0.000 ( 0.158)  Loss 4.3934e-01 (4.3452e-01)  Acc@1 89.45 ( 90.55)  Acc@5 95.31 ( 95.99)
    Epoch: [99][20/22]  Time 0.106 ( 0.187)  Data 0.035 ( 0.108)  Loss 4.0396e-01 (4.2578e-01)  Acc@1 90.23 ( 90.92)  Acc@5 96.48 ( 96.19)
    Test: [ 0/10]  Time 1.214 ( 1.214)  Loss 2.1979e-01 (2.1979e-01)  Acc@1 95.70 ( 95.70)  Acc@5 98.05 ( 98.05)
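To make the baseline explicit, here is a sketch of what I ran; apart from the two commented-out lines quoted from `main_worker`, the surrounding code (including the class count) is only illustrative:

```python
from torchvision.models import resnet18

# Baseline: instantiate ResNet18 directly and keep its default global average pooling.
model = resnet18(num_classes=600)  # class count is illustrative for my palmprint dataset

# ROTP variant: the two lines I commented out for the baseline run; restoring them
# replaces the global pooling layer with the ROTP layer.
# dim = 512
# model = replace_pooling(model, k=args.k, dim=dim, f_method=args.f_method)
```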

minjiecheng commented 3 weeks ago

Thank you for your interest in our work. I would like to confirm whether the above results were obtained after tuning the parameters in ROTP, such as adjusting the feed-forward step.
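To be concrete, by "tuning" I mean something like the sweep sketched below over the arguments visible in your snippet (`k`, `dim`, `f_method`); the import path, candidate values, class count, and training/evaluation loop are placeholders rather than part of the released code:

```python
# Hypothetical tuning sweep over the ROTP arguments shown in the snippet above.
from torchvision.models import resnet18
# replace_pooling is imported from this repository's code (import path omitted here).

dim = 512
for k in (1, 2, 4, 8):                       # illustrative candidates for k
    for f_method in ("sinkhorn", "badmm"):   # illustrative candidates for f_method
        model = resnet18(num_classes=600)    # class count set for your palmprint data
        model = replace_pooling(model, k=k, dim=dim, f_method=f_method)
        # ... train the model, then record Acc@1 / Acc@5 on the test split ...
```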

dyqii commented 3 weeks ago

I did not adjust the preset parameters of ROTP. Here are my dataset and code:
Link: https://pan.baidu.com/s/1ruXW8gzPkn8ypC_oXHICcA?pwd=judm
Extraction code: judm
(shared via Baidu Netdisk)