Open · angryhen opened this issue 4 years ago
Hi: While using your code, I compared nn.CrossEntropyLoss() against LabelSmoothingCrossEntropy() and printed the loss at every step, shown below. The left column is the smoothing loss and the right column is nn.CrossEntropyLoss. The smoothing loss seems to plateau, moving only within a narrow band (0.5~0.6), while the plain cross-entropy loss keeps dropping lower. I have run into the same thing in code I wrote myself. Have you noticed this issue? Looking forward to your reply.
```
0.6104838848114014 0.13801012933254242
0.609204888343811  0.14818525314331055
0.6149349212646484 0.1491081565618515
0.6321774125099182 0.17487832903862
0.6126755475997925 0.1317736655473709
0.6100113987922668 0.1340378224849701
0.6236982345581055 0.15228210389614105
0.5979918837547302 0.11493214219808578
0.6044023036956787 0.12213070690631866
0.5954407453536987 0.1088051125407219
Epoch:3 || epochiter: 32/139|| Totel iter 310 || Loss: 0.595441||ACC: 99.219 ||LR: 0.00044660
0.6146756410598755 0.13752762973308563
0.6278829574584961 0.1602025330066681
0.6053824424743652 0.1260662078857422
0.6094704270362854 0.14142774045467377
0.6054157018661499 0.13225990533828735
0.6109800934791565 0.14464853703975677
0.6221442818641663 0.14965230226516724
0.6030154228210449 0.12700901925563812
0.6105899214744568 0.13878554105758667
0.6118856072425842 0.13709238171577454
Epoch:3 || epochiter: 42/139|| Totel iter 320 || Loss: 0.611886||ACC: 99.219 ||LR: 0.00046097
0.6141043305397034 0.15106570720672607
0.6156240701675415 0.16219794750213623
0.6021517515182495 0.1331876516342163
0.5985877513885498 0.12620455026626587
0.6136577129364014 0.1546640545129776
0.6055587530136108 0.12843844294548035
0.6039034128189087 0.13453435897827148
0.600081741809845  0.12076513469219208
0.5962940454483032 0.11982955038547516
0.6148940324783325 0.13918887078762054
```
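For context, a likely explanation: with label smoothing the target distribution is never one-hot, so the cross entropy is bounded below by the entropy of the smoothed target distribution rather than by 0. Below is a minimal sketch of a typical label-smoothing cross entropy (not necessarily this repo's exact implementation; the class name and the values eps=0.1, K=10 are assumptions for illustration) that demonstrates the floor:

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelSmoothingCE(nn.Module):
    """Typical label-smoothing CE: the target puts (1 - eps + eps/K) on the
    true class and eps/K on each other class (class name is hypothetical)."""

    def __init__(self, eps: float = 0.1):
        super().__init__()
        self.eps = eps

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        log_probs = F.log_softmax(logits, dim=-1)
        # Negative log-likelihood of the true class.
        nll = -log_probs.gather(dim=-1, index=target.unsqueeze(-1)).squeeze(-1)
        # Uniform component averaged over all K classes.
        smooth = -log_probs.mean(dim=-1)
        return ((1.0 - self.eps) * nll + self.eps * smooth).mean()


# The loss is minimized when the predicted distribution equals the smoothed
# target, so its minimum is the entropy of that target, not 0.
eps, k = 0.1, 10  # assumed values; the repo's eps and class count may differ
p_true, p_rest = 1.0 - eps + eps / k, eps / k
floor = -(p_true * math.log(p_true) + (k - 1) * p_rest * math.log(p_rest))
print(f"theoretical floor: {floor:.4f}")  # ~0.5003 for eps=0.1, K=10

# Feed the optimal prediction back in: the loss sits at the floor, not at 0.
q = torch.full((1, k), p_rest)
q[0, 3] = p_true
print(LabelSmoothingCE(eps)(q.log(), torch.tensor([3])))  # ~0.500, the floor
```

Under these assumed values the floor is about 0.50, which matches the 0.5~0.6 band in the log above; the unsmoothed nn.CrossEntropyLoss has no such floor and can keep decreasing toward 0.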
Sorry, I haven't paid much attention to this issue. I'll give it a try later.
Thanks, looking forward to your progress.