[Open] MeaninglessAI opened this issue 5 years ago
Hi,
When I train the model, the loss lingers around 1.2 the whole time. I wonder how low the loss should be after convergence when you trained the model.
When I evaluated your model, the best performance I got was:
best rank-1: 0.76, mAP: 0.66
So I am worried that the model has not actually converged.
Thank you so much.

Loss log:
==> Epoch 151/800 Batch 5/12 Loss 1.151732 (1.184221) Batch 10/12 Loss 1.185098 (1.208741)
==> Epoch 152/800 Batch 5/12 Loss 1.195779 (1.248034) Batch 10/12 Loss 1.187773 (1.223771)
==> Epoch 153/800 Batch 5/12 Loss 1.199124 (1.226188) Batch 10/12 Loss 1.244340 (1.216574)
==> Epoch 154/800 Batch 5/12 Loss 1.171001 (1.206451) Batch 10/12 Loss 1.234953 (1.210964)
==> Epoch 155/800 Batch 5/12 Loss 1.171692 (1.175456) Batch 10/12 Loss 1.286676 (1.226450)
==> Epoch 156/800 Batch 5/12 Loss 1.176779 (1.190393) Batch 10/12 Loss 1.375999 (1.226623)
==> Epoch 157/800 Batch 5/12 Loss 1.252197 (1.201436) Batch 10/12 Loss 1.162636 (1.229630)
==> Epoch 158/800 Batch 5/12 Loss 1.189308 (1.214950) Batch 10/12 Loss 1.181550 (1.208582)
==> Epoch 159/800 Batch 5/12 Loss 1.128059 (1.146427) Batch 10/12 Loss 1.189266 (1.164883)
==> Epoch 160/800 Batch 5/12 Loss 1.303839 (1.234521) Batch 10/12 Loss 1.246772 (1.249348)
==> Epoch 161/800 Batch 5/12 Loss 1.279776 (1.200658) Batch 10/12 Loss 1.450521 (1.241791)
==> Epoch 162/800 Batch 5/12 Loss 1.182487 (1.217111) Batch 10/12 Loss 1.222181 (1.232863)
==> Epoch 163/800 Batch 5/12 Loss 1.215730 (1.204643) Batch 10/12 Loss 1.201601 (1.217562)
==> Epoch 164/800 Batch 5/12 Loss 1.250196 (1.210316) Batch 10/12 Loss 1.244857 (1.207420)
==> Epoch 165/800 Batch 5/12 Loss 1.213206 (1.162842) Batch 10/12 Loss 1.271064 (1.197076)
==> Epoch 166/800 Batch 5/12 Loss 1.193753 (1.167730) Batch 10/12 Loss 1.221726 (1.206353)
==> Epoch 167/800 Batch 5/12 Loss 1.202258 (1.226315) Batch 10/12 Loss 1.281449 (1.228988)
==> Epoch 168/800 Batch 5/12 Loss 1.234186 (1.282551) Batch 10/12 Loss 1.286633 (1.289782)
==> Epoch 169/800 Batch 5/12 Loss 1.176978 (1.193318) Batch 10/12 Loss 1.212864 (1.219412)
==> Epoch 170/800 Batch 5/12 Loss 1.220609 (1.195325) Batch 10/12 Loss 1.191236 (1.221831)
==> Epoch 171/800 Batch 5/12 Loss 1.377638 (1.262232) Batch 10/12 Loss 1.282985 (1.233217)
==> Epoch 172/800 Batch 5/12 Loss 1.161230 (1.185284) Batch 10/12 Loss 1.184502 (1.199786)
==> Epoch 173/800 Batch 5/12 Loss 1.168909 (1.182812) Batch 10/12 Loss 1.270397 (1.197942)
==> Epoch 174/800 Batch 5/12 Loss 1.204447 (1.163887) Batch 10/12 Loss 1.176649 (1.187690)
==> Epoch 175/800 Batch 5/12 Loss 1.198594 (1.184971) Batch 10/12 Loss 1.302678 (1.217098)
==> Epoch 176/800 Batch 5/12 Loss 1.255241 (1.176709) Batch 10/12 Loss 1.314871 (1.217345)
==> Epoch 177/800 Batch 5/12 Loss 1.457606 (1.255013) Batch 10/12 Loss 1.256773 (1.233927)
==> Epoch 178/800 Batch 5/12 Loss 1.180098 (1.210582) Batch 10/12 Loss 1.198249 (1.219975)
==> Epoch 179/800 Batch 5/12 Loss 1.415169 (1.233042) Batch 10/12 Loss 1.247434 (1.268601)
==> Epoch 180/800 Batch 5/12 Loss 1.294599 (1.236518) Batch 10/12 Loss 1.276755 (1.259600)
==> Epoch 181/800 Batch 5/12 Loss 1.345576 (1.215330) Batch 10/12 Loss 1.286063 (1.239036)
==> Epoch 182/800 Batch 5/12 Loss 1.205686 (1.186106) Batch 10/12 Loss 1.180740 (1.196601)
==> Epoch 183/800 Batch 5/12 Loss 1.212626 (1.198935) Batch 10/12 Loss 1.218805 (1.222616)
==> Epoch 184/800 Batch 5/12 Loss 1.262955 (1.225434) Batch 10/12 Loss 1.270376 (1.229889)
==> Epoch 185/800 Batch 5/12 Loss 1.184287 (1.197745) Batch 10/12 Loss 1.258302 (1.225032)
==> Epoch 186/800 Batch 5/12 Loss 1.210953 (1.191885) Batch 10/12 Loss 1.270131 (1.197497)
==> Epoch 187/800 Batch 5/12 Loss 1.239055 (1.209134) Batch 10/12 Loss 1.215639 (1.213043)
==> Epoch 188/800 Batch 5/12 Loss 1.246695 (1.194201) Batch 10/12 Loss 1.292399 (1.216818)
==> Epoch 189/800 Batch 5/12 Loss 1.197155 (1.217120) Batch 10/12 Loss 1.179686 (1.225153)
==> Epoch 190/800 Batch 5/12 Loss 1.154700 (1.212153) Batch 10/12 Loss 1.190199 (1.214623)
==> Epoch 191/800 Batch 5/12 Loss 1.315284 (1.203522) Batch 10/12 Loss 1.280430 (1.217727)
==> Epoch 192/800 Batch 5/12 Loss 1.147254 (1.190746) Batch 10/12 Loss 1.161241 (1.193478)
==> Epoch 193/800 Batch 5/12 Loss 1.260765 (1.196339) Batch 10/12 Loss 1.136279 (1.181638)
==> Epoch 194/800 Batch 5/12 Loss 1.167764 (1.230083) Batch 10/12 Loss 1.203083 (1.219795)
==> Epoch 195/800 Batch 5/12 Loss 1.213726 (1.254149) Batch 10/12 Loss 1.210874 (1.226763)
==> Epoch 196/800 Batch 5/12 Loss 1.157766 (1.199533) Batch 10/12 Loss 1.231522 (1.201280)
==> Epoch 197/800 Batch 5/12 Loss 1.145700 (1.159924) Batch 10/12 Loss 1.217924 (1.186239)
==> Epoch 198/800 Batch 5/12 Loss 1.230108 (1.180395) Batch 10/12 Loss 1.216664 (1.208750)
==> Epoch 199/800 Batch 5/12 Loss 1.188747 (1.171446) Batch 10/12 Loss 1.129958 (1.195133)
==> Epoch 200/800
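For reference, below is a minimal sketch of how the log above could be summarized to check for a plateau. This is my own snippet, not part of the repository; it only assumes the log format shown above, and the `LOG` string is meant to hold the full pasted log.

```python
# Minimal sketch (not from the repository): parse the pasted training log and
# compare the mean running-average loss over the first and second half of the
# logged epochs. If the two are essentially equal, the loss has plateaued at
# the current learning rate.
import re

LOG = """
==> Epoch 151/800 Batch 5/12 Loss 1.151732 (1.184221) Batch 10/12 Loss 1.185098 (1.208741)
==> Epoch 199/800 Batch 5/12 Loss 1.188747 (1.171446) Batch 10/12 Loss 1.129958 (1.195133)
"""  # paste the full log between the quotes

# Each epoch entry looks like: Epoch E/800 Batch k/12 Loss x (running_avg) ...
epoch_re = re.compile(r"Epoch (\d+)/\d+((?:\s*Batch \d+/\d+ Loss [\d.]+ \([\d.]+\))+)")
avg_per_epoch = {}
for epoch, batches in epoch_re.findall(LOG):
    # The value in parentheses is the running average printed by the trainer.
    avgs = [float(a) for a in re.findall(r"\(([\d.]+)\)", batches)]
    avg_per_epoch[int(epoch)] = sum(avgs) / len(avgs)

epochs = sorted(avg_per_epoch)
half = len(epochs) // 2
first = sum(avg_per_epoch[e] for e in epochs[:half]) / max(half, 1)
second = sum(avg_per_epoch[e] for e in epochs[half:]) / max(len(epochs) - half, 1)
print(f"mean avg loss, epochs {epochs[0]}-{epochs[half - 1]}: {first:.4f}")
print(f"mean avg loss, epochs {epochs[half]}-{epochs[-1]}: {second:.4f}")
```

On the 50 epochs shown, the two halves come out nearly identical, which is why I suspect the loss is no longer decreasing rather than still converging slowly.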