lumina37 / rotate-captcha-crack

A CNN that predicts the rotation angle of an image ✨ Can be used to crack rotation captchas

RotNetR validation loss won't go down #20

Open hacktmz opened 1 year ago

hacktmz commented 1 year ago

Training parameters: lr = 0.01, momentum = 0.9, epochs = 300, steps = 128. Train/validation split: 70%/30%. Roughly 1100 coin images in total, with the white borders cropped off.

Dataset preview

image

The predicted angles at test time are all wrong. How can I optimize this?

<2023-08-18 13:57:37.784> [INFO] Epoch#0. time_cost: 32.89 s. train_loss: 5.39984232. val_loss: 5.36462307 <2023-08-18 13:57:55.110> [INFO] Epoch#1. time_cost: 48.94 s. train_loss: 5.35185540. val_loss: 5.38960536 <2023-08-18 13:58:11.490> [INFO] Epoch#2. time_cost: 64.82 s. train_loss: 5.28306431. val_loss: 5.41948541 <2023-08-18 13:58:28.066> [INFO] Epoch#3. time_cost: 80.73 s. train_loss: 5.21700990. val_loss: 5.41810258 <2023-08-18 13:58:44.535> [INFO] Epoch#4. time_cost: 96.61 s. train_loss: 5.15511119. val_loss: 5.41366911 <2023-08-18 13:59:01.068> [INFO] Epoch#5. time_cost: 112.57 s. train_loss: 5.06084383. val_loss: 5.40824318 <2023-08-18 13:59:17.673> [INFO] Epoch#6. time_cost: 128.55 s. train_loss: 4.97198361. val_loss: 5.40104500 <2023-08-18 13:59:34.405> [INFO] Epoch#7. time_cost: 144.58 s. train_loss: 4.89924210. val_loss: 5.39258560 <2023-08-18 13:59:51.164> [INFO] Epoch#8. time_cost: 160.73 s. train_loss: 4.81734937. val_loss: 5.38335117 <2023-08-18 14:00:08.268> [INFO] Epoch#9. time_cost: 177.25 s. train_loss: 4.71686673. val_loss: 5.37633689 <2023-08-18 14:00:24.879> [INFO] Epoch#10. time_cost: 193.28 s. train_loss: 4.65475571. val_loss: 5.37279606 <2023-08-18 14:00:41.440> [INFO] Epoch#11. time_cost: 209.35 s. train_loss: 4.55467379. val_loss: 5.36483145 <2023-08-18 14:00:58.116> [INFO] Epoch#12. time_cost: 225.54 s. train_loss: 4.47352850. val_loss: 5.35914739 <2023-08-18 14:01:15.269> [INFO] Epoch#13. time_cost: 241.79 s. train_loss: 4.41723913. val_loss: 5.35702181 <2023-08-18 14:01:32.629> [INFO] Epoch#14. time_cost: 258.01 s. train_loss: 4.31735170. val_loss: 5.35660950 <2023-08-18 14:01:49.672> [INFO] Epoch#15. time_cost: 274.19 s. train_loss: 4.23688066. val_loss: 5.35307089 <2023-08-18 14:02:06.594> [INFO] Epoch#16. time_cost: 290.31 s. train_loss: 4.18479526. val_loss: 5.35310698 <2023-08-18 14:02:23.423> [INFO] Epoch#17. time_cost: 306.63 s. train_loss: 4.08692378. val_loss: 5.34825500 <2023-08-18 14:02:40.599> [INFO] Epoch#18. time_cost: 322.81 s. train_loss: 4.01058915. val_loss: 5.34844208 <2023-08-18 14:02:57.059> [INFO] Epoch#19. time_cost: 338.78 s. train_loss: 3.92014989. val_loss: 5.34933837 <2023-08-18 14:03:13.625> [INFO] Epoch#20. time_cost: 354.79 s. train_loss: 3.84989792. val_loss: 5.34641616 <2023-08-18 14:03:30.806> [INFO] Epoch#21. time_cost: 371.08 s. train_loss: 3.75730199. val_loss: 5.34646304 <2023-08-18 14:03:47.440> [INFO] Epoch#22. time_cost: 387.23 s. train_loss: 3.69406390. val_loss: 5.34865459 <2023-08-18 14:04:04.430> [INFO] Epoch#23. time_cost: 403.58 s. train_loss: 3.58468822. val_loss: 5.34718132 <2023-08-18 14:04:21.192> [INFO] Epoch#24. time_cost: 419.76 s. train_loss: 3.53524327. val_loss: 5.35206652 <2023-08-18 14:04:37.991> [INFO] Epoch#25. time_cost: 435.94 s. train_loss: 3.43011984. val_loss: 5.34888013 <2023-08-18 14:04:54.962> [INFO] Epoch#26. time_cost: 452.26 s. train_loss: 3.37779093. val_loss: 5.34560808 <2023-08-18 14:05:11.900> [INFO] Epoch#27. time_cost: 468.42 s. train_loss: 3.25674978. val_loss: 5.34464852 <2023-08-18 14:05:28.939> [INFO] Epoch#28. time_cost: 484.61 s. train_loss: 3.21313724. val_loss: 5.34361029 <2023-08-18 14:05:46.150> [INFO] Epoch#29. time_cost: 500.94 s. train_loss: 3.11788267. val_loss: 5.34200462 <2023-08-18 14:06:03.293> [INFO] Epoch#30. time_cost: 517.17 s. train_loss: 3.03583649. val_loss: 5.34520737 <2023-08-18 14:06:20.056> [INFO] Epoch#31. time_cost: 533.36 s. train_loss: 3.02078411. val_loss: 5.34665251 <2023-08-18 14:06:36.799> [INFO] Epoch#32. time_cost: 549.56 s. 
train_loss: 2.89035311. val_loss: 5.34966660 <2023-08-18 14:06:53.613> [INFO] Epoch#33. time_cost: 565.89 s. train_loss: 2.80607820. val_loss: 5.34706863 <2023-08-18 14:07:10.240> [INFO] Epoch#34. time_cost: 582.04 s. train_loss: 2.75119737. val_loss: 5.34983921 <2023-08-18 14:07:27.220> [INFO] Epoch#35. time_cost: 598.46 s. train_loss: 2.66276371. val_loss: 5.34948270 <2023-08-18 14:07:43.911> [INFO] Epoch#36. time_cost: 614.61 s. train_loss: 2.62098613. val_loss: 5.34578753 <2023-08-18 14:08:00.852> [INFO] Epoch#37. time_cost: 631.00 s. train_loss: 2.56218195. val_loss: 5.35035912 <2023-08-18 14:08:17.749> [INFO] Epoch#38. time_cost: 647.41 s. train_loss: 2.44102857. val_loss: 5.35140228 <2023-08-18 14:08:34.332> [INFO] Epoch#39. time_cost: 663.49 s. train_loss: 2.41077507. val_loss: 5.35149717 <2023-08-18 14:08:50.979> [INFO] Epoch#40. time_cost: 679.53 s. train_loss: 2.29778108. val_loss: 5.35420863 <2023-08-18 14:09:07.607> [INFO] Epoch#41. time_cost: 695.53 s. train_loss: 2.24829721. val_loss: 5.35726897 <2023-08-18 14:09:24.769> [INFO] Epoch#42. time_cost: 712.05 s. train_loss: 2.17894366. val_loss: 5.35133410 <2023-08-18 14:09:41.792> [INFO] Epoch#43. time_cost: 728.57 s. train_loss: 2.12200819. val_loss: 5.35315466 <2023-08-18 14:09:58.563> [INFO] Epoch#44. time_cost: 744.72 s. train_loss: 2.07205370. val_loss: 5.34872564 <2023-08-18 14:10:15.165> [INFO] Epoch#45. time_cost: 760.83 s. train_loss: 1.99733144. val_loss: 5.34793647 <2023-08-18 14:10:32.038> [INFO] Epoch#46. time_cost: 777.10 s. train_loss: 1.95925856. val_loss: 5.34805107 <2023-08-18 14:10:49.068> [INFO] Epoch#47. time_cost: 793.60 s. train_loss: 1.88353942. val_loss: 5.34884135 <2023-08-18 14:11:05.784> [INFO] Epoch#48. time_cost: 809.81 s. train_loss: 1.83938010. val_loss: 5.35141436 <2023-08-18 14:11:22.648> [INFO] Epoch#49. time_cost: 826.09 s. train_loss: 1.79292178. val_loss: 5.34627692 <2023-08-18 14:11:39.304> [INFO] Epoch#50. time_cost: 842.29 s. train_loss: 1.68158425. val_loss: 5.34675503 <2023-08-18 14:11:55.872> [INFO] Epoch#51. time_cost: 858.40 s. train_loss: 1.62872086. val_loss: 5.34923681 <2023-08-18 14:12:12.628> [INFO] Epoch#52. time_cost: 874.63 s. train_loss: 1.59070106. val_loss: 5.35235278 <2023-08-18 14:12:29.318> [INFO] Epoch#53. time_cost: 890.81 s. train_loss: 1.58161676. val_loss: 5.35133855 <2023-08-18 14:12:46.143> [INFO] Epoch#54. time_cost: 907.11 s. train_loss: 1.45260657. val_loss: 5.34634527 <2023-08-18 14:13:02.999> [INFO] Epoch#55. time_cost: 923.48 s. train_loss: 1.47507057. val_loss: 5.34150346 <2023-08-18 14:13:20.303> [INFO] Epoch#56. time_cost: 939.80 s. train_loss: 1.38150918. val_loss: 5.34167035 <2023-08-18 14:13:36.953> [INFO] Epoch#57. time_cost: 955.96 s. train_loss: 1.36797532. val_loss: 5.33937915 <2023-08-18 14:13:54.243> [INFO] Epoch#58. time_cost: 972.27 s. train_loss: 1.28458185. val_loss: 5.34360949 <2023-08-18 14:14:11.088> [INFO] Epoch#59. time_cost: 988.53 s. train_loss: 1.24424957. val_loss: 5.34156942 <2023-08-18 14:14:27.971> [INFO] Epoch#60. time_cost: 1004.85 s. train_loss: 1.18852963. val_loss: 5.33656184 <2023-08-18 14:14:44.796> [INFO] Epoch#61. time_cost: 1020.81 s. train_loss: 1.13654406. val_loss: 5.33123016 <2023-08-18 14:15:02.092> [INFO] Epoch#62. time_cost: 1037.30 s. train_loss: 1.11281125. val_loss: 5.33135033 <2023-08-18 14:15:19.106> [INFO] Epoch#63. time_cost: 1053.65 s. train_loss: 1.05972394. val_loss: 5.33335686 <2023-08-18 14:15:35.944> [INFO] Epoch#64. time_cost: 1069.97 s. train_loss: 1.02143451. 
val_loss: 5.33008957 <2023-08-18 14:15:52.917> [INFO] Epoch#65. time_cost: 1086.13 s. train_loss: 1.01201259. val_loss: 5.32418426 <2023-08-18 14:16:10.429> [INFO] Epoch#66. time_cost: 1102.75 s. train_loss: 0.97423361. val_loss: 5.32560396 <2023-08-18 14:16:26.975> [INFO] Epoch#67. time_cost: 1118.77 s. train_loss: 0.98155708. val_loss: 5.32422972 <2023-08-18 14:16:43.738> [INFO] Epoch#68. time_cost: 1134.93 s. train_loss: 0.89851645. val_loss: 5.32218838 <2023-08-18 14:17:00.692> [INFO] Epoch#69. time_cost: 1151.01 s. train_loss: 0.90840406. val_loss: 5.32185570 <2023-08-18 14:17:17.757> [INFO] Epoch#70. time_cost: 1167.22 s. train_loss: 0.84972844. val_loss: 5.31999524 <2023-08-18 14:17:34.972> [INFO] Epoch#71. time_cost: 1183.57 s. train_loss: 0.82386800. val_loss: 5.31885560 <2023-08-18 14:17:52.144> [INFO] Epoch#72. time_cost: 1199.80 s. train_loss: 0.79325682. val_loss: 5.32391421 <2023-08-18 14:18:09.007> [INFO] Epoch#73. time_cost: 1216.09 s. train_loss: 0.75515443. val_loss: 5.32378785 <2023-08-18 14:18:25.720> [INFO] Epoch#74. time_cost: 1232.24 s. train_loss: 0.71768803. val_loss: 5.31845586 <2023-08-18 14:18:42.934> [INFO] Epoch#75. time_cost: 1248.52 s. train_loss: 0.74559616. val_loss: 5.31854550 <2023-08-18 14:18:59.665> [INFO] Epoch#76. time_cost: 1264.72 s. train_loss: 0.68202233. val_loss: 5.31668854 <2023-08-18 14:19:16.607> [INFO] Epoch#77. time_cost: 1280.92 s. train_loss: 0.65233253. val_loss: 5.31342141 <2023-08-18 14:19:33.557> [INFO] Epoch#78. time_cost: 1296.93 s. train_loss: 0.64974236. val_loss: 5.30892277 <2023-08-18 14:19:50.504> [INFO] Epoch#79. time_cost: 1313.06 s. train_loss: 0.58420439. val_loss: 5.30771160 <2023-08-18 14:20:07.773> [INFO] Epoch#80. time_cost: 1329.33 s. train_loss: 0.60893153. val_loss: 5.30896378 <2023-08-18 14:20:24.518> [INFO] Epoch#81. time_cost: 1345.57 s. train_loss: 0.57614258. val_loss: 5.30884250 <2023-08-18 14:20:41.130> [INFO] Epoch#82. time_cost: 1361.64 s. train_loss: 0.53065565. val_loss: 5.30851078 <2023-08-18 14:20:58.022> [INFO] Epoch#83. time_cost: 1377.96 s. train_loss: 0.55741334. val_loss: 5.30744998 <2023-08-18 14:21:14.904> [INFO] Epoch#84. time_cost: 1393.96 s. train_loss: 0.48479454. val_loss: 5.30663506 <2023-08-18 14:21:32.100> [INFO] Epoch#85. time_cost: 1410.37 s. train_loss: 0.47788703. val_loss: 5.30053854 <2023-08-18 14:21:49.213> [INFO] Epoch#86. time_cost: 1426.53 s. train_loss: 0.48423344. val_loss: 5.29945850 <2023-08-18 14:22:06.621> [INFO] Epoch#87. time_cost: 1442.83 s. train_loss: 0.44990043. val_loss: 5.30025927 <2023-08-18 14:22:23.385> [INFO] Epoch#88. time_cost: 1458.98 s. train_loss: 0.44953348. val_loss: 5.29905891 <2023-08-18 14:22:40.358> [INFO] Epoch#89. time_cost: 1474.94 s. train_loss: 0.43896342. val_loss: 5.30130974 <2023-08-18 14:22:57.296> [INFO] Epoch#90. time_cost: 1491.33 s. train_loss: 0.40358200. val_loss: 5.30313905 <2023-08-18 14:23:14.203> [INFO] Epoch#91. time_cost: 1507.74 s. train_loss: 0.38030928. val_loss: 5.30020889 <2023-08-18 14:23:31.249> [INFO] Epoch#92. time_cost: 1524.19 s. train_loss: 0.36855397. val_loss: 5.29769659 <2023-08-18 14:23:48.304> [INFO] Epoch#93. time_cost: 1540.40 s. train_loss: 0.39343657. val_loss: 5.29662418 <2023-08-18 14:24:05.624> [INFO] Epoch#94. time_cost: 1556.82 s. train_loss: 0.35523611. val_loss: 5.29806995 <2023-08-18 14:24:22.604> [INFO] Epoch#95. time_cost: 1573.18 s. train_loss: 0.38470637. val_loss: 5.29897292 <2023-08-18 14:24:39.336> [INFO] Epoch#96. time_cost: 1589.40 s. train_loss: 0.35899499. 
val_loss: 5.30027898 <2023-08-18 14:24:56.202> [INFO] Epoch#97. time_cost: 1605.79 s. train_loss: 0.33074107. val_loss: 5.29959599 <2023-08-18 14:25:12.997> [INFO] Epoch#98. time_cost: 1622.00 s. train_loss: 0.31499263. val_loss: 5.29822334 <2023-08-18 14:25:29.901> [INFO] Epoch#99. time_cost: 1638.42 s. train_loss: 0.31733391. val_loss: 5.30082989 <2023-08-18 14:25:46.728> [INFO] Epoch#100. time_cost: 1654.55 s. train_loss: 0.32080573. val_loss: 5.29806964 <2023-08-18 14:26:03.189> [INFO] Epoch#101. time_cost: 1670.50 s. train_loss: 0.30004243. val_loss: 5.29328775 <2023-08-18 14:26:20.261> [INFO] Epoch#102. time_cost: 1686.65 s. train_loss: 0.29755034. val_loss: 5.29279629 <2023-08-18 14:26:37.467> [INFO] Epoch#103. time_cost: 1702.96 s. train_loss: 0.28089099. val_loss: 5.29503473 <2023-08-18 14:26:54.288> [INFO] Epoch#104. time_cost: 1719.07 s. train_loss: 0.28207827. val_loss: 5.29597712 <2023-08-18 14:27:11.176> [INFO] Epoch#105. time_cost: 1735.35 s. train_loss: 0.26215186. val_loss: 5.29589160 <2023-08-18 14:27:27.886> [INFO] Epoch#106. time_cost: 1751.49 s. train_loss: 0.25227861. val_loss: 5.29495859 <2023-08-18 14:27:44.839> [INFO] Epoch#107. time_cost: 1767.86 s. train_loss: 0.27683722. val_loss: 5.29369672 <2023-08-18 14:28:01.458> [INFO] Epoch#108. time_cost: 1783.96 s. train_loss: 0.24133206. val_loss: 5.29565271 <2023-08-18 14:28:18.337> [INFO] Epoch#109. time_cost: 1800.35 s. train_loss: 0.23433661. val_loss: 5.29376634 <2023-08-18 14:28:34.999> [INFO] Epoch#110. time_cost: 1816.54 s. train_loss: 0.23755440. val_loss: 5.29343255 <2023-08-18 14:28:51.802> [INFO] Epoch#111. time_cost: 1832.89 s. train_loss: 0.22447769. val_loss: 5.29052639 <2023-08-18 14:29:08.952> [INFO] Epoch#112. time_cost: 1849.30 s. train_loss: 0.23238146. val_loss: 5.29428466 <2023-08-18 14:29:25.784> [INFO] Epoch#113. time_cost: 1865.56 s. train_loss: 0.22170261. val_loss: 5.29492776 <2023-08-18 14:29:42.793> [INFO] Epoch#114. time_cost: 1882.06 s. train_loss: 0.21545648. val_loss: 5.29312213 <2023-08-18 14:29:59.412> [INFO] Epoch#115. time_cost: 1898.21 s. train_loss: 0.20180116. val_loss: 5.29303535 <2023-08-18 14:30:16.259> [INFO] Epoch#116. time_cost: 1914.53 s. train_loss: 0.19667596. val_loss: 5.29073985 <2023-08-18 14:30:33.082> [INFO] Epoch#117. time_cost: 1930.81 s. train_loss: 0.20202053. val_loss: 5.29082489 <2023-08-18 14:30:49.921> [INFO] Epoch#118. time_cost: 1947.08 s. train_loss: 0.19160863. val_loss: 5.29071124 <2023-08-18 14:31:06.598> [INFO] Epoch#119. time_cost: 1963.19 s. train_loss: 0.19253234. val_loss: 5.29217211 <2023-08-18 14:31:23.262> [INFO] Epoch#120. time_cost: 1979.35 s. train_loss: 0.20008727. val_loss: 5.29381418 <2023-08-18 14:31:40.064> [INFO] Epoch#121. time_cost: 1995.64 s. train_loss: 0.18423576. val_loss: 5.29436827 <2023-08-18 14:31:57.018> [INFO] Epoch#122. time_cost: 2012.03 s. train_loss: 0.18706201. val_loss: 5.29476261 <2023-08-18 14:32:13.724> [INFO] Epoch#123. time_cost: 2028.22 s. train_loss: 0.17524794. val_loss: 5.29642137 <2023-08-18 14:32:30.843> [INFO] Epoch#124. time_cost: 2044.73 s. train_loss: 0.17649993. val_loss: 5.29553525 <2023-08-18 14:32:47.986> [INFO] Epoch#125. time_cost: 2061.28 s. train_loss: 0.16367954. val_loss: 5.29705667 <2023-08-18 14:33:04.760> [INFO] Epoch#126. time_cost: 2077.57 s. train_loss: 0.16235787. val_loss: 5.29822620 <2023-08-18 14:33:21.836> [INFO] Epoch#127. time_cost: 2094.10 s. train_loss: 0.15805034. val_loss: 5.29666471 <2023-08-18 14:33:38.651> [INFO] Epoch#128. time_cost: 2110.38 s. train_loss: 0.16695243. 
val_loss: 5.29671876 <2023-08-18 14:33:55.408> [INFO] Epoch#129. time_cost: 2126.55 s. train_loss: 0.14886506. val_loss: 5.29992644 <2023-08-18 14:34:12.275> [INFO] Epoch#130. time_cost: 2142.95 s. train_loss: 0.14749421. val_loss: 5.29999638 <2023-08-18 14:34:29.053> [INFO] Epoch#131. time_cost: 2159.16 s. train_loss: 0.16307789. val_loss: 5.29719162 <2023-08-18 14:34:45.788> [INFO] Epoch#132. time_cost: 2175.27 s. train_loss: 0.14282146. val_loss: 5.29932435 <2023-08-18 14:35:02.511> [INFO] Epoch#133. time_cost: 2191.50 s. train_loss: 0.14902505. val_loss: 5.29678647 <2023-08-18 14:35:19.176> [INFO] Epoch#134. time_cost: 2207.62 s. train_loss: 0.16553759. val_loss: 5.29835685 <2023-08-18 14:35:36.006> [INFO] Epoch#135. time_cost: 2223.88 s. train_loss: 0.13034127. val_loss: 5.29728015 <2023-08-18 14:35:52.973> [INFO] Epoch#136. time_cost: 2240.26 s. train_loss: 0.13669445. val_loss: 5.29960585 <2023-08-18 14:36:09.725> [INFO] Epoch#137. time_cost: 2256.47 s. train_loss: 0.13718895. val_loss: 5.30132929 <2023-08-18 14:36:26.503> [INFO] Epoch#138. time_cost: 2272.64 s. train_loss: 0.13112331. val_loss: 5.30023400

lumina37 commented 1 year ago

This kind of result usually means the model has learned some irrelevant feature. How about first training on only obverse images, or only reverse images, to see what happens? Also remember to set the DataLoader's shuffle to True; without random shuffling it is much easier for the model to learn something skewed.
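For reference, the suggested shuffle setting as a minimal generic PyTorch sketch (`train_dataset` stands in for whatever Dataset yields the rotated image/label pairs):

```python
from torch.utils.data import DataLoader

# Reshuffle the training samples every epoch so the model cannot
# exploit a fixed sample order instead of learning the rotation angle.
train_dataloader = DataLoader(
    train_dataset,  # any Dataset yielding (rotated_img, angle_label)
    batch_size=32,
    shuffle=True,   # the setting being recommended here
    drop_last=True,
)
```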

hacktmz commented 1 year ago

Training on identical single-side images still fails to converge, and shuffle is already set to True.

image

<2023-08-18 16:52:27.948> [INFO] Epoch#0. time_cost: 11.26 s. train_loss: 5.30330102. val_loss: 5.32662797 <2023-08-18 16:52:33.756> [INFO] Epoch#1. time_cost: 16.06 s. train_loss: 5.26328055. val_loss: 5.38382506 <2023-08-18 16:52:39.128> [INFO] Epoch#2. time_cost: 20.85 s. train_loss: 5.20550839. val_loss: 5.31866765 <2023-08-18 16:52:44.975> [INFO] Epoch#3. time_cost: 25.73 s. train_loss: 5.07200551. val_loss: 5.38940692 <2023-08-18 16:52:50.417> [INFO] Epoch#4. time_cost: 30.53 s. train_loss: 4.94194833. val_loss: 5.35936069 <2023-08-18 16:52:55.881> [INFO] Epoch#5. time_cost: 35.36 s. train_loss: 4.83208760. val_loss: 5.32527280 <2023-08-18 16:53:01.330> [INFO] Epoch#6. time_cost: 40.19 s. train_loss: 4.71431120. val_loss: 5.40086532 <2023-08-18 16:53:06.812> [INFO] Epoch#7. time_cost: 45.03 s. train_loss: 4.48791242. val_loss: 5.35610247 <2023-08-18 16:53:12.485> [INFO] Epoch#8. time_cost: 50.08 s. train_loss: 4.32728696. val_loss: 5.38680983 <2023-08-18 16:53:17.952> [INFO] Epoch#9. time_cost: 54.94 s. train_loss: 4.13140206. val_loss: 5.36251259 <2023-08-18 16:53:23.518> [INFO] Epoch#10. time_cost: 59.77 s. train_loss: 3.97570495. val_loss: 5.32798958 <2023-08-18 16:53:29.088> [INFO] Epoch#11. time_cost: 64.74 s. train_loss: 3.82368414. val_loss: 5.36804605 <2023-08-18 16:53:34.854> [INFO] Epoch#12. time_cost: 69.90 s. train_loss: 3.66186694. val_loss: 5.29402494 <2023-08-18 16:53:40.715> [INFO] Epoch#13. time_cost: 74.82 s. train_loss: 3.46259562. val_loss: 5.31973505 <2023-08-18 16:53:46.188> [INFO] Epoch#14. time_cost: 79.68 s. train_loss: 3.34844212. val_loss: 5.29534698 <2023-08-18 16:53:51.751> [INFO] Epoch#15. time_cost: 84.59 s. train_loss: 3.12492613. val_loss: 5.39548421 <2023-08-18 16:53:57.348> [INFO] Epoch#16. time_cost: 89.56 s. train_loss: 2.98319157. val_loss: 5.40921092 <2023-08-18 16:54:02.957> [INFO] Epoch#17. time_cost: 94.47 s. train_loss: 2.84793460. val_loss: 5.34042692 <2023-08-18 16:54:08.591> [INFO] Epoch#18. time_cost: 99.37 s. train_loss: 2.65782102. val_loss: 5.36988568 <2023-08-18 16:54:14.213> [INFO] Epoch#19. time_cost: 104.32 s. train_loss: 2.52567454. val_loss: 5.33618855 <2023-08-18 16:54:19.883> [INFO] Epoch#20. time_cost: 109.29 s. train_loss: 2.35375420. val_loss: 5.31661797 <2023-08-18 16:54:25.426> [INFO] Epoch#21. time_cost: 114.18 s. train_loss: 2.25517937. val_loss: 5.37782240 <2023-08-18 16:54:30.948> [INFO] Epoch#22. time_cost: 119.09 s. train_loss: 2.05319774. val_loss: 5.36982632 <2023-08-18 16:54:36.424> [INFO] Epoch#23. time_cost: 123.95 s. train_loss: 1.93894289. val_loss: 5.39411950 <2023-08-18 16:54:42.007> [INFO] Epoch#24. time_cost: 128.88 s. train_loss: 1.76177744. val_loss: 5.34503984 <2023-08-18 16:54:47.459> [INFO] Epoch#25. time_cost: 133.70 s. train_loss: 1.74932716. val_loss: 5.22905540 <2023-08-18 16:54:53.228> [INFO] Epoch#26. time_cost: 138.54 s. train_loss: 1.56955890. val_loss: 5.32931566 <2023-08-18 16:54:58.849> [INFO] Epoch#27. time_cost: 143.47 s. train_loss: 1.49377288. val_loss: 5.37630987 <2023-08-18 16:55:04.515> [INFO] Epoch#28. time_cost: 148.42 s. train_loss: 1.33536245. val_loss: 5.35716152 <2023-08-18 16:55:10.284> [INFO] Epoch#29. time_cost: 153.42 s. train_loss: 1.32433244. val_loss: 5.38517284 <2023-08-18 16:55:15.962> [INFO] Epoch#30. time_cost: 158.48 s. train_loss: 1.14002415. val_loss: 5.52969408 <2023-08-18 16:55:21.536> [INFO] Epoch#31. time_cost: 163.43 s. train_loss: 1.11591223. val_loss: 5.44804335 <2023-08-18 16:55:27.003> [INFO] Epoch#32. time_cost: 168.30 s. train_loss: 1.01697875. 
val_loss: 5.22046757 <2023-08-18 16:55:32.841> [INFO] Epoch#33. time_cost: 173.26 s. train_loss: 0.95621278. val_loss: 5.43956280 <2023-08-18 16:55:38.391> [INFO] Epoch#34. time_cost: 178.15 s. train_loss: 0.81955061. val_loss: 5.47637320 <2023-08-18 16:55:43.865> [INFO] Epoch#35. time_cost: 183.01 s. train_loss: 0.75798803. val_loss: 5.41070747 <2023-08-18 16:55:49.375> [INFO] Epoch#36. time_cost: 187.90 s. train_loss: 0.72421407. val_loss: 5.15936089 <2023-08-18 16:55:55.175> [INFO] Epoch#37. time_cost: 192.77 s. train_loss: 0.68637379. val_loss: 5.37881804 <2023-08-18 16:56:00.713> [INFO] Epoch#38. time_cost: 197.66 s. train_loss: 0.65562144. val_loss: 5.26347351 <2023-08-18 16:56:06.249> [INFO] Epoch#39. time_cost: 202.59 s. train_loss: 0.56010985. val_loss: 5.29488277 <2023-08-18 16:56:11.710> [INFO] Epoch#40. time_cost: 207.44 s. train_loss: 0.59256203. val_loss: 5.26576877 <2023-08-18 16:56:17.202> [INFO] Epoch#41. time_cost: 212.32 s. train_loss: 0.52209971. val_loss: 5.45372605 <2023-08-18 16:56:22.794> [INFO] Epoch#42. time_cost: 217.24 s. train_loss: 0.54710286. val_loss: 5.31848598 <2023-08-18 16:56:28.444> [INFO] Epoch#43. time_cost: 222.27 s. train_loss: 0.44930240. val_loss: 5.51086330 <2023-08-18 16:56:33.973> [INFO] Epoch#44. time_cost: 227.18 s. train_loss: 0.39940043. val_loss: 5.34111023 <2023-08-18 16:56:39.596> [INFO] Epoch#45. time_cost: 232.18 s. train_loss: 0.39297298. val_loss: 5.43975902 <2023-08-18 16:56:45.118> [INFO] Epoch#46. time_cost: 237.06 s. train_loss: 0.34562535. val_loss: 5.47232461 <2023-08-18 16:56:50.623> [INFO] Epoch#47. time_cost: 241.95 s. train_loss: 0.34902160. val_loss: 5.39059401 <2023-08-18 16:56:56.181> [INFO] Epoch#48. time_cost: 246.91 s. train_loss: 0.31559715. val_loss: 5.29491401 <2023-08-18 16:57:01.753> [INFO] Epoch#49. time_cost: 251.86 s. train_loss: 0.31622377. val_loss: 5.41225433 <2023-08-18 16:57:07.428> [INFO] Epoch#50. time_cost: 256.90 s. train_loss: 0.27491240. val_loss: 5.36622477 <2023-08-18 16:57:13.041> [INFO] Epoch#51. time_cost: 261.90 s. train_loss: 0.26355119. val_loss: 5.40250540 <2023-08-18 16:57:18.648> [INFO] Epoch#52. time_cost: 266.88 s. train_loss: 0.27926902. val_loss: 5.48708844 <2023-08-18 16:57:24.032> [INFO] Epoch#53. time_cost: 271.68 s. train_loss: 0.23514211. val_loss: 5.40322447 <2023-08-18 16:57:29.688> [INFO] Epoch#54. time_cost: 276.72 s. train_loss: 0.22060247. val_loss: 5.42269874 <2023-08-18 16:57:35.271> [INFO] Epoch#55. time_cost: 281.60 s. train_loss: 0.22819298. val_loss: 5.53691554 <2023-08-18 16:57:40.751> [INFO] Epoch#56. time_cost: 286.48 s. train_loss: 0.23862205. val_loss: 5.37165403 <2023-08-18 16:57:46.264> [INFO] Epoch#57. time_cost: 291.37 s. train_loss: 0.19607902. val_loss: 5.45195055 <2023-08-18 16:57:51.808> [INFO] Epoch#58. time_cost: 296.29 s. train_loss: 0.18923844. val_loss: 5.23308468 <2023-08-18 16:57:57.512> [INFO] Epoch#59. time_cost: 301.35 s. train_loss: 0.17699011. val_loss: 5.42624283 <2023-08-18 16:58:03.158> [INFO] Epoch#60. time_cost: 306.36 s. train_loss: 0.16317427. val_loss: 5.39616227 <2023-08-18 16:58:08.804> [INFO] Epoch#61. time_cost: 311.37 s. train_loss: 0.19332631. val_loss: 5.45374703 <2023-08-18 16:58:14.428> [INFO] Epoch#62. time_cost: 316.35 s. train_loss: 0.21091129. val_loss: 5.40493679 <2023-08-18 16:58:20.010> [INFO] Epoch#63. time_cost: 321.29 s. train_loss: 0.16297683. val_loss: 5.45400238 <2023-08-18 16:58:25.555> [INFO] Epoch#64. time_cost: 326.21 s. train_loss: 0.16020645. val_loss: 5.27163005 <2023-08-18 16:58:31.197> [INFO] Epoch#65. 
time_cost: 331.14 s. train_loss: 0.14810304. val_loss: 5.25995469 <2023-08-18 16:58:36.691> [INFO] Epoch#66. time_cost: 336.01 s. train_loss: 0.16113032. val_loss: 5.52074027 <2023-08-18 16:58:42.176> [INFO] Epoch#67. time_cost: 340.87 s. train_loss: 0.15936315. val_loss: 5.49150276 <2023-08-18 16:58:47.723> [INFO] Epoch#68. time_cost: 345.82 s. train_loss: 0.15403310. val_loss: 5.33653975 <2023-08-18 16:58:53.258> [INFO] Epoch#69. time_cost: 350.73 s. train_loss: 0.13249908. val_loss: 5.39583731 <2023-08-18 16:58:58.856> [INFO] Epoch#70. time_cost: 355.70 s. train_loss: 0.13025107. val_loss: 5.31142569 <2023-08-18 16:59:04.342> [INFO] Epoch#71. time_cost: 360.57 s. train_loss: 0.12579903. val_loss: 5.47937417 <2023-08-18 16:59:09.908> [INFO] Epoch#72. time_cost: 365.51 s. train_loss: 0.11129420. val_loss: 5.40961981 <2023-08-18 16:59:15.418> [INFO] Epoch#73. time_cost: 370.42 s. train_loss: 0.12030931. val_loss: 5.46381927 <2023-08-18 16:59:21.051> [INFO] Epoch#74. time_cost: 375.42 s. train_loss: 0.12002229. val_loss: 5.33053684 <2023-08-18 16:59:26.585> [INFO] Epoch#75. time_cost: 380.34 s. train_loss: 0.14545687. val_loss: 5.56017542 <2023-08-18 16:59:32.247> [INFO] Epoch#76. time_cost: 385.28 s. train_loss: 0.11673824. val_loss: 5.28624845 <2023-08-18 16:59:37.766> [INFO] Epoch#77. time_cost: 390.18 s. train_loss: 0.10143876. val_loss: 5.39271259 <2023-08-18 16:59:43.311> [INFO] Epoch#78. time_cost: 395.10 s. train_loss: 0.11019807. val_loss: 5.31737995 <2023-08-18 16:59:48.822> [INFO] Epoch#79. time_cost: 400.02 s. train_loss: 0.09523510. val_loss: 5.51333284 <2023-08-18 16:59:54.437> [INFO] Epoch#80. time_cost: 405.01 s. train_loss: 0.09562507. val_loss: 5.39607668 <2023-08-18 16:59:59.961> [INFO] Epoch#81. time_cost: 409.91 s. train_loss: 0.09793488. val_loss: 5.46005845 <2023-08-18 17:00:05.540> [INFO] Epoch#82. time_cost: 414.87 s. train_loss: 0.09791914. val_loss: 5.41902351 <2023-08-18 17:00:11.215> [INFO] Epoch#83. time_cost: 419.93 s. train_loss: 0.11880134. val_loss: 5.33904815 <2023-08-18 17:00:16.760> [INFO] Epoch#84. time_cost: 424.86 s. train_loss: 0.09215725. val_loss: 5.45071840 <2023-08-18 17:00:22.240> [INFO] Epoch#85. time_cost: 429.71 s. train_loss: 0.08812353. val_loss: 5.31730056 <2023-08-18 17:00:27.677> [INFO] Epoch#86. time_cost: 434.55 s. train_loss: 0.08547030. val_loss: 5.34717917 <2023-08-18 17:00:33.119> [INFO] Epoch#87. time_cost: 439.39 s. train_loss: 0.09604750. val_loss: 5.15446329 <2023-08-18 17:00:39.004> [INFO] Epoch#88. time_cost: 444.35 s. train_loss: 0.07881441. val_loss: 5.36302662 <2023-08-18 17:00:44.469> [INFO] Epoch#89. time_cost: 449.20 s. train_loss: 0.10159408. val_loss: 5.35543585 <2023-08-18 17:00:50.156> [INFO] Epoch#90. time_cost: 454.19 s. train_loss: 0.07790474. val_loss: 5.45754957 <2023-08-18 17:00:55.691> [INFO] Epoch#91. time_cost: 459.11 s. train_loss: 0.08819332. val_loss: 5.41740751 <2023-08-18 17:01:01.184> [INFO] Epoch#92. time_cost: 463.99 s. train_loss: 0.07858732. val_loss: 5.43756819 <2023-08-18 17:01:06.740> [INFO] Epoch#93. time_cost: 468.93 s. train_loss: 0.08139104. val_loss: 5.48051214 <2023-08-18 17:01:12.295> [INFO] Epoch#94. time_cost: 473.87 s. train_loss: 0.07225644. val_loss: 5.27634001 <2023-08-18 17:01:17.880> [INFO] Epoch#95. time_cost: 478.86 s. train_loss: 0.07885717. val_loss: 5.18226361 <2023-08-18 17:01:23.371> [INFO] Epoch#96. time_cost: 483.72 s. train_loss: 0.06458863. val_loss: 5.64266992 <2023-08-18 17:01:29.011> [INFO] Epoch#97. time_cost: 488.74 s. train_loss: 0.07287594. 
val_loss: 5.45775628 <2023-08-18 17:01:34.560> [INFO] Epoch#98. time_cost: 493.68 s. train_loss: 0.08625005. val_loss: 5.48485804 <2023-08-18 17:01:40.021> [INFO] Epoch#99. time_cost: 498.53 s. train_loss: 0.06330816. val_loss: 5.21741295 <2023-08-18 17:01:45.493> [INFO] Epoch#100. time_cost: 503.40 s. train_loss: 0.06806743. val_loss: 5.43132496 <2023-08-18 17:01:51.108> [INFO] Epoch#101. time_cost: 508.34 s. train_loss: 0.06728531. val_loss: 5.22270417 <2023-08-18 17:01:56.643> [INFO] Epoch#102. time_cost: 513.26 s. train_loss: 0.07651205. val_loss: 5.60741663 <2023-08-18 17:02:02.148> [INFO] Epoch#103. time_cost: 518.15 s. train_loss: 0.06601417. val_loss: 5.32008457 <2023-08-18 17:02:07.644> [INFO] Epoch#104. time_cost: 523.01 s. train_loss: 0.06601928. val_loss: 5.35806060 <2023-08-18 17:02:13.296> [INFO] Epoch#105. time_cost: 528.05 s. train_loss: 0.06467549. val_loss: 5.35822177 <2023-08-18 17:02:18.809> [INFO] Epoch#106. time_cost: 532.95 s. train_loss: 0.07210446. val_loss: 5.37992096 <2023-08-18 17:02:24.311> [INFO] Epoch#107. time_cost: 537.84 s. train_loss: 0.06643014. val_loss: 5.42277884 <2023-08-18 17:02:29.885> [INFO] Epoch#108. time_cost: 542.81 s. train_loss: 0.05864677. val_loss: 5.73795843 <2023-08-18 17:02:35.345> [INFO] Epoch#109. time_cost: 547.65 s. train_loss: 0.06491178. val_loss: 5.45618892 <2023-08-18 17:02:40.759> [INFO] Epoch#110. time_cost: 552.44 s. train_loss: 0.05744175. val_loss: 5.30905938 <2023-08-18 17:02:46.252> [INFO] Epoch#111. time_cost: 557.33 s. train_loss: 0.05695161. val_loss: 5.29266262 <2023-08-18 17:02:51.874> [INFO] Epoch#112. time_cost: 562.23 s. train_loss: 0.05810733. val_loss: 5.36781240 <2023-08-18 17:02:57.450> [INFO] Epoch#113. time_cost: 567.18 s. train_loss: 0.06044777. val_loss: 5.46884584 <2023-08-18 17:03:02.980> [INFO] Epoch#114. time_cost: 572.10 s. train_loss: 0.05924652. val_loss: 5.33167601 <2023-08-18 17:03:08.519> [INFO] Epoch#115. time_cost: 577.01 s. train_loss: 0.05411220. val_loss: 5.51172185 <2023-08-18 17:03:14.046> [INFO] Epoch#116. time_cost: 581.93 s. train_loss: 0.05188428. val_loss: 5.42046356 <2023-08-18 17:03:19.544> [INFO] Epoch#117. time_cost: 586.83 s. train_loss: 0.05135338. val_loss: 5.55253291 <2023-08-18 17:03:25.127> [INFO] Epoch#118. time_cost: 591.77 s. train_loss: 0.05763117. val_loss: 5.43751192 <2023-08-18 17:03:30.580> [INFO] Epoch#119. time_cost: 596.62 s. train_loss: 0.05355322. val_loss: 5.45424414 <2023-08-18 17:03:36.110> [INFO] Epoch#120. time_cost: 601.52 s. train_loss: 0.04620792. val_loss: 5.34147072 <2023-08-18 17:03:41.649> [INFO] Epoch#121. time_cost: 606.44 s. train_loss: 0.04430308. val_loss: 5.37313151 <2023-08-18 17:03:47.080> [INFO] Epoch#122. time_cost: 611.25 s. train_loss: 0.05001135. val_loss: 5.42952180 <2023-08-18 17:03:52.746> [INFO] Epoch#123. time_cost: 616.30 s. train_loss: 0.05492518. val_loss: 5.32263398 <2023-08-18 17:03:58.400> [INFO] Epoch#124. time_cost: 621.26 s. train_loss: 0.04963074. val_loss: 5.21857309 <2023-08-18 17:04:03.965> [INFO] Epoch#125. time_cost: 626.20 s. train_loss: 0.05894958. val_loss: 5.35917640 <2023-08-18 17:04:09.444> [INFO] Epoch#126. time_cost: 631.10 s. train_loss: 0.04274863. val_loss: 5.38477874 <2023-08-18 17:04:14.867> [INFO] Epoch#127. time_cost: 635.93 s. train_loss: 0.05253102. val_loss: 5.48755050 <2023-08-18 17:04:20.483> [INFO] Epoch#128. time_cost: 640.95 s. train_loss: 0.04418161. val_loss: 5.36615944 <2023-08-18 17:04:25.961> [INFO] Epoch#129. time_cost: 645.82 s. train_loss: 0.05168360. 
val_loss: 5.40945005 <2023-08-18 17:04:31.403> [INFO] Epoch#130. time_cost: 650.65 s. train_loss: 0.04251063. val_loss: 5.39642477 <2023-08-18 17:04:36.908> [INFO] Epoch#131. time_cost: 655.55 s. train_loss: 0.04522851. val_loss: 5.38135076 <2023-08-18 17:04:42.498> [INFO] Epoch#132. time_cost: 660.52 s. train_loss: 0.04688337. val_loss: 5.53917551 <2023-08-18 17:04:48.012> [INFO] Epoch#133. time_cost: 665.43 s. train_loss: 0.04650050. val_loss: 5.37437010 <2023-08-18 17:04:53.596> [INFO] Epoch#134. time_cost: 670.34 s. train_loss: 0.04531294. val_loss: 5.32696652 <2023-08-18 17:04:59.098> [INFO] Epoch#135. time_cost: 675.23 s. train_loss: 0.04147261. val_loss: 5.20313597 <2023-08-18 17:05:04.667> [INFO] Epoch#136. time_cost: 680.18 s. train_loss: 0.04341542. val_loss: 5.24532628 <2023-08-18 17:05:10.237> [INFO] Epoch#137. time_cost: 685.15 s. train_loss: 0.04142874. val_loss: 5.45382905 <2023-08-18 17:05:15.721> [INFO] Epoch#138. time_cost: 690.01 s. train_loss: 0.04173257. val_loss: 5.47300363 <2023-08-18 17:05:21.211> [INFO] Epoch#139. time_cost: 694.90 s. train_loss: 0.04357434. val_loss: 5.33823633 <2023-08-18 17:05:26.749> [INFO] Epoch#140. time_cost: 699.83 s. train_loss: 0.04733616. val_loss: 5.46602774 <2023-08-18 17:05:32.240> [INFO] Epoch#141. time_cost: 704.71 s. train_loss: 0.04136438. val_loss: 5.32237291 <2023-08-18 17:05:37.721> [INFO] Epoch#142. time_cost: 709.58 s. train_loss: 0.03627620. val_loss: 5.42067575 <2023-08-18 17:05:43.188> [INFO] Epoch#143. time_cost: 714.45 s. train_loss: 0.03925915. val_loss: 5.46165037 <2023-08-18 17:05:48.682> [INFO] Epoch#144. time_cost: 719.34 s. train_loss: 0.03792857. val_loss: 5.46521688 <2023-08-18 17:05:54.215> [INFO] Epoch#145. time_cost: 724.17 s. train_loss: 0.03960869. val_loss: 5.41369390 <2023-08-18 17:05:59.723> [INFO] Epoch#146. time_cost: 729.06 s. train_loss: 0.03514957. val_loss: 5.48138380 <2023-08-18 17:06:05.231> [INFO] Epoch#147. time_cost: 733.95 s. train_loss: 0.04448471. val_loss: 5.33512712 <2023-08-18 17:06:10.745> [INFO] Epoch#148. time_cost: 738.85 s. train_loss: 0.03458080. val_loss: 5.19379210 <2023-08-18 17:06:16.220> [INFO] Epoch#149. time_cost: 743.70 s. train_loss: 0.03476699. val_loss: 5.42594194 <2023-08-18 17:06:21.759> [INFO] Epoch#150. time_cost: 748.64 s. train_loss: 0.03833890. val_loss: 5.37024045 <2023-08-18 17:06:27.295> [INFO] Epoch#151. time_cost: 753.54 s. train_loss: 0.03714462. val_loss: 5.34531856 <2023-08-18 17:06:32.944> [INFO] Epoch#152. time_cost: 758.58 s. train_loss: 0.04011831. val_loss: 5.32632899 <2023-08-18 17:06:38.455> [INFO] Epoch#153. time_cost: 763.46 s. train_loss: 0.04507950. val_loss: 5.28544354 <2023-08-18 17:06:44.056> [INFO] Epoch#154. time_cost: 768.42 s. train_loss: 0.03567235. val_loss: 5.37078619 <2023-08-18 17:06:49.593> [INFO] Epoch#155. time_cost: 773.35 s. train_loss: 0.03932706. val_loss: 5.34415340 <2023-08-18 17:06:55.210> [INFO] Epoch#156. time_cost: 778.32 s. train_loss: 0.03472318. val_loss: 5.40771365 <2023-08-18 17:07:00.810> [INFO] Epoch#157. time_cost: 783.23 s. train_loss: 0.03815113. val_loss: 5.60919714 <2023-08-18 17:07:06.332> [INFO] Epoch#158. time_cost: 788.14 s. train_loss: 0.03213267. val_loss: 5.38632631 <2023-08-18 17:07:11.789> [INFO] Epoch#159. time_cost: 792.99 s. train_loss: 0.03157291. val_loss: 5.62310457 <2023-08-18 17:07:17.197> [INFO] Epoch#160. time_cost: 797.84 s. train_loss: 0.03437347. val_loss: 5.31411767 <2023-08-18 17:07:22.710> [INFO] Epoch#161. time_cost: 802.76 s. train_loss: 0.03174445. 
val_loss: 5.53535295 <2023-08-18 17:07:28.216> [INFO] Epoch#162. time_cost: 807.64 s. train_loss: 0.03642854. val_loss: 5.44857097 <2023-08-18 17:07:33.753> [INFO] Epoch#163. time_cost: 812.56 s. train_loss: 0.03894210. val_loss: 5.47438121 <2023-08-18 17:07:39.311> [INFO] Epoch#164. time_cost: 817.51 s. train_loss: 0.03483183. val_loss: 5.35244107 <2023-08-18 17:07:44.898> [INFO] Epoch#165. time_cost: 822.46 s. train_loss: 0.03730110. val_loss: 5.42034888 <2023-08-18 17:07:50.396> [INFO] Epoch#166. time_cost: 827.34 s. train_loss: 0.03387760. val_loss: 5.33185983 <2023-08-18 17:07:56.042> [INFO] Epoch#167. time_cost: 832.29 s. train_loss: 0.03823958. val_loss: 5.32589149 <2023-08-18 17:08:01.592> [INFO] Epoch#168. time_cost: 837.25 s. train_loss: 0.03560706. val_loss: 5.39625525 <2023-08-18 17:08:07.178> [INFO] Epoch#169. time_cost: 842.21 s. train_loss: 0.02904032. val_loss: 5.35998750 <2023-08-18 17:08:12.667> [INFO] Epoch#170. time_cost: 847.10 s. train_loss: 0.03341103. val_loss: 5.49449563 <2023-08-18 17:08:18.180> [INFO] Epoch#171. time_cost: 852.01 s. train_loss: 0.03474875. val_loss: 5.48542976 <2023-08-18 17:08:23.746> [INFO] Epoch#172. time_cost: 856.94 s. train_loss: 0.03299613. val_loss: 5.52249503 <2023-08-18 17:08:29.276> [INFO] Epoch#173. time_cost: 861.83 s. train_loss: 0.03398239. val_loss: 5.63685536 <2023-08-18 17:08:34.753> [INFO] Epoch#174. time_cost: 866.69 s. train_loss: 0.02945490. val_loss: 5.42094088 <2023-08-18 17:08:40.240> [INFO] Epoch#175. time_cost: 871.56 s. train_loss: 0.02840251. val_loss: 5.69379497 <2023-08-18 17:08:45.826> [INFO] Epoch#176. time_cost: 876.53 s. train_loss: 0.02756416. val_loss: 5.56099296 <2023-08-18 17:08:51.384> [INFO] Epoch#177. time_cost: 881.48 s. train_loss: 0.03283496. val_loss: 5.40805626 <2023-08-18 17:08:57.047> [INFO] Epoch#178. time_cost: 886.44 s. train_loss: 0.03240528. val_loss: 5.31137824 <2023-08-18 17:09:02.554> [INFO] Epoch#179. time_cost: 891.30 s. train_loss: 0.02928356. val_loss: 5.42326951 <2023-08-18 17:09:08.079> [INFO] Epoch#180. time_cost: 896.20 s. train_loss: 0.03587892. val_loss: 5.45484829 <2023-08-18 17:09:13.542> [INFO] Epoch#181. time_cost: 901.07 s. train_loss: 0.04012054. val_loss: 5.44649816 <2023-08-18 17:09:19.104> [INFO] Epoch#182. time_cost: 906.01 s. train_loss: 0.02857378. val_loss: 5.39741778 <2023-08-18 17:09:24.655> [INFO] Epoch#183. time_cost: 910.94 s. train_loss: 0.02655693. val_loss: 5.41928601 <2023-08-18 17:09:30.200> [INFO] Epoch#184. time_cost: 915.86 s. train_loss: 0.03125123. val_loss: 5.58228779 <2023-08-18 17:09:35.700> [INFO] Epoch#185. time_cost: 920.73 s. train_loss: 0.03121312. val_loss: 5.44002223 <2023-08-18 17:09:41.265> [INFO] Epoch#186. time_cost: 925.68 s. train_loss: 0.02763961. val_loss: 5.45899653 <2023-08-18 17:09:46.883> [INFO] Epoch#187. time_cost: 930.69 s. train_loss: 0.02685601. val_loss: 5.48374534 <2023-08-18 17:09:52.440> [INFO] Epoch#188. time_cost: 935.61 s. train_loss: 0.02776013. val_loss: 5.46111584 <2023-08-18 17:09:57.927> [INFO] Epoch#189. time_cost: 940.50 s. train_loss: 0.02638166. val_loss: 5.60892177 <2023-08-18 17:10:03.526> [INFO] Epoch#190. time_cost: 945.49 s. train_loss: 0.02653074. val_loss: 5.13279104 <2023-08-18 17:10:09.458> [INFO] Epoch#191. time_cost: 950.45 s. train_loss: 0.02427226. val_loss: 5.58440328 <2023-08-18 17:10:14.906> [INFO] Epoch#192. time_cost: 955.29 s. train_loss: 0.03171158. val_loss: 5.61189842 <2023-08-18 17:10:20.516> [INFO] Epoch#193. time_cost: 960.28 s. train_loss: 0.02493854. 
val_loss: 5.39307165 <2023-08-18 17:10:26.003> [INFO] Epoch#194. time_cost: 965.17 s. train_loss: 0.02884255. val_loss: 5.54699516 <2023-08-18 17:10:31.489> [INFO] Epoch#195. time_cost: 970.02 s. train_loss: 0.03158595. val_loss: 5.52103448 <2023-08-18 17:10:36.968> [INFO] Epoch#196. time_cost: 974.90 s. train_loss: 0.02557721. val_loss: 5.46148872 <2023-08-18 17:10:42.548> [INFO] Epoch#197. time_cost: 979.86 s. train_loss: 0.02455417. val_loss: 5.64398980 <2023-08-18 17:10:48.120> [INFO] Epoch#198. time_cost: 984.81 s. train_loss: 0.02883675. val_loss: 5.24348927 <2023-08-18 17:10:53.642> [INFO] Epoch#199. time_cost: 989.71 s. train_loss: 0.02697669. val_loss: 5.49870038 <2023-08-18 17:10:59.159> [INFO] Epoch#200. time_cost: 994.59 s. train_loss: 0.02243770. val_loss: 5.45685840 <2023-08-18 17:11:04.705> [INFO] Epoch#201. time_cost: 999.52 s. train_loss: 0.02589454. val_loss: 5.39851737 <2023-08-18 17:11:10.261> [INFO] Epoch#202. time_cost: 1004.47 s. train_loss: 0.02893260. val_loss: 5.52150702 <2023-08-18 17:11:15.859> [INFO] Epoch#203. time_cost: 1009.46 s. train_loss: 0.02829824. val_loss: 5.46824288 <2023-08-18 17:11:21.350> [INFO] Epoch#204. time_cost: 1014.35 s. train_loss: 0.02839780. val_loss: 5.33534241 <2023-08-18 17:11:26.858> [INFO] Epoch#205. time_cost: 1019.22 s. train_loss: 0.02283491. val_loss: 5.48758030 <2023-08-18 17:11:32.283> [INFO] Epoch#206. time_cost: 1024.01 s. train_loss: 0.02938521. val_loss: 5.45340896 <2023-08-18 17:11:37.778> [INFO] Epoch#207. time_cost: 1028.89 s. train_loss: 0.02286907. val_loss: 5.38048625 <2023-08-18 17:11:43.357> [INFO] Epoch#208. time_cost: 1033.86 s. train_loss: 0.02554865. val_loss: 5.55267978 <2023-08-18 17:11:48.878> [INFO] Epoch#209. time_cost: 1038.78 s. train_loss: 0.02529276. val_loss: 5.46401834 <2023-08-18 17:11:54.392> [INFO] Epoch#210. time_cost: 1043.68 s. train_loss: 0.02222722. val_loss: 5.41124010 <2023-08-18 17:11:59.876> [INFO] Epoch#211. time_cost: 1048.55 s. train_loss: 0.02464435. val_loss: 5.40487456 <2023-08-18 17:12:05.365> [INFO] Epoch#212. time_cost: 1053.39 s. train_loss: 0.02601153. val_loss: 5.42090082 <2023-08-18 17:12:10.882> [INFO] Epoch#213. time_cost: 1058.29 s. train_loss: 0.02078980. val_loss: 5.40114021 <2023-08-18 17:12:16.442> [INFO] Epoch#214. time_cost: 1063.26 s. train_loss: 0.02538042. val_loss: 5.50385857 <2023-08-18 17:12:21.926> [INFO] Epoch#215. time_cost: 1068.14 s. train_loss: 0.02432403. val_loss: 5.45867205 <2023-08-18 17:12:27.432> [INFO] Epoch#216. time_cost: 1073.02 s. train_loss: 0.02621710. val_loss: 5.62054467 <2023-08-18 17:12:32.867> [INFO] Epoch#217. time_cost: 1077.86 s. train_loss: 0.02467208. val_loss: 5.46568441 <2023-08-18 17:12:38.490> [INFO] Epoch#218. time_cost: 1082.77 s. train_loss: 0.02607746. val_loss: 5.49830675 <2023-08-18 17:12:44.123> [INFO] Epoch#219. time_cost: 1087.77 s. train_loss: 0.02776598. val_loss: 5.42323661 <2023-08-18 17:12:49.695> [INFO] Epoch#220. time_cost: 1092.72 s. train_loss: 0.02273856. val_loss: 5.42024589 <2023-08-18 17:12:55.296> [INFO] Epoch#221. time_cost: 1097.74 s. train_loss: 0.02539457. val_loss: 5.26896286 <2023-08-18 17:13:00.879> [INFO] Epoch#222. time_cost: 1102.70 s. train_loss: 0.02268215. val_loss: 5.46498084 <2023-08-18 17:13:06.403> [INFO] Epoch#223. time_cost: 1107.60 s. train_loss: 0.02315321. val_loss: 5.50621009 <2023-08-18 17:13:11.997> [INFO] Epoch#224. time_cost: 1112.57 s. train_loss: 0.02258990. val_loss: 5.49236298 <2023-08-18 17:13:17.514> [INFO] Epoch#225. time_cost: 1117.48 s. train_loss: 0.02169418. 
val_loss: 5.61205339 <2023-08-18 17:13:22.977> [INFO] Epoch#226. time_cost: 1122.33 s. train_loss: 0.02059807. val_loss: 5.51667738 <2023-08-18 17:13:28.632> [INFO] Epoch#227. time_cost: 1127.37 s. train_loss: 0.02105716. val_loss: 5.38954735 <2023-08-18 17:13:34.130> [INFO] Epoch#228. time_cost: 1132.24 s. train_loss: 0.02462164. val_loss: 5.44811368 <2023-08-18 17:13:39.678> [INFO] Epoch#229. time_cost: 1137.15 s. train_loss: 0.02404069. val_loss: 5.41940117 <2023-08-18 17:13:45.329> [INFO] Epoch#230. time_cost: 1142.18 s. train_loss: 0.02199012. val_loss: 5.49373794 <2023-08-18 17:13:50.942> [INFO] Epoch#231. time_cost: 1147.18 s. train_loss: 0.02008617. val_loss: 5.40747380 <2023-08-18 17:13:56.539> [INFO] Epoch#232. time_cost: 1152.15 s. train_loss: 0.01896449. val_loss: 5.38771868 <2023-08-18 17:14:02.081> [INFO] Epoch#233. time_cost: 1157.07 s. train_loss: 0.02353778. val_loss: 5.31932235 <2023-08-18 17:14:07.676> [INFO] Epoch#234. time_cost: 1162.06 s. train_loss: 0.02101104. val_loss: 5.33468461

hacktmz commented 1 year ago

```python
import argparse
from pathlib import Path

import torch
from torch.nn import CrossEntropyLoss
from torch.utils.data import DataLoader

from rotate_captcha_crack.common import device
from rotate_captcha_crack.dataset import ImgTsSeqFromPath, RotDataset, from_google_streetview, from_pcgs_pics
from rotate_captcha_crack.helper import default_num_workers
from rotate_captcha_crack.lr import LRManager
from rotate_captcha_crack.model import RotNetR
from rotate_captcha_crack.trainer import Trainer
from rotate_captcha_crack.utils import slice_from_range
from rotate_captcha_crack.visualizer import visualize_train

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--resume", "-r", type=int, default=None, help="Resume from which index. -1 leads to the last training process"
    )
    opts = parser.parse_args()

    #################################
    ### Custom configuration area ###
    dataset_root = Path("test/1C Lincoln Cent (Wheat Reverse)-obverse/")

    # img_paths = from_google_streetview(dataset_root)
    img_paths = from_pcgs_pics(dataset_root)
    # print(img_paths[0])
    cls_num = 180
    train_img_paths = slice_from_range(img_paths, (0.0, 0.70))
    train_dataset = RotDataset(ImgTsSeqFromPath(train_img_paths), cls_num=cls_num)
    val_img_paths = slice_from_range(img_paths, (0.70, 1.0))
    for v in val_img_paths:
        print(v)
    val_dataset = RotDataset(ImgTsSeqFromPath(val_img_paths), cls_num=cls_num)

    num_workers = default_num_workers()
    train_dataloader = DataLoader(
        train_dataset,
        batch_size=32,
        num_workers=num_workers,
        shuffle=True,
        drop_last=True,
    )
    val_dataloader = DataLoader(
        val_dataset,
        batch_size=32,
        num_workers=num_workers,
        drop_last=True,
        shuffle=True,
    )

    model = RotNetR(cls_num)
    model = model.to(device)

    lr = 0.01
    momentum = 0.9
    epochs = 300
    steps = 128
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=momentum)
    scheduler = torch.optim.lr_scheduler.OneCycleLR(
        optimizer, max_lr=lr, pct_start=0.25, epochs=epochs, steps_per_epoch=steps
    )
    lr = LRManager(lr, scheduler, optimizer)
    loss = CrossEntropyLoss()

    trainer = Trainer(model, train_dataloader, val_dataloader, lr, loss, epochs, steps)
    ### Custom configuration area ###
    #################################

    if opts.resume is not None:
        trainer.resume(opts.resume)

    trainer.train()

    visualize_train(trainer.finder.model_dir)
```

lumina37 commented 1 year ago

I just noticed your training time and realized the problem is most likely too little data, compounded by too few texture types (only two). If your application only ever deals with a few specific designs, you'd be better off using traditional feature-point matching directly.
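For readers who want to try that route, here is a rough OpenCV sketch of feature-point matching for angle estimation (illustrative only; `template.png` and `query.png` are hypothetical file names, and real coin images would likely need the background masked out first):

```python
import cv2
import numpy as np

# Estimate the rotation between a reference coin image and a rotated query
# by matching ORB keypoints and taking the median orientation difference.
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)
query = cv2.imread("query.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_q, des_q = orb.detectAndCompute(query, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_q), key=lambda m: m.distance)[:50]

# Each ORB keypoint carries an orientation in degrees; the per-match
# difference approximates the global rotation of the coin.
# Note: a plain median is crude near the 0/360 wraparound.
diffs = [(kp_q[m.trainIdx].angle - kp_t[m.queryIdx].angle) % 360 for m in matches]
angle = np.median(diffs)
print(f"estimated rotation: {angle:.1f} deg")
```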

hacktmz commented 1 year ago

I added 50,000 images of different coins for training, but the loss still gets stuck around 0.6 and won't go any lower. I've already tried traditional feature-point matching; it doesn't work well because each coin differs in color, age, and wear.

image image

hacktmz commented 1 year ago

After switching to RotNet, the loss came down.

lumina37 commented 1 year ago

Could it be because the number of classes increased?
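For context on why the validation numbers above look the way they do: with cross-entropy over `cls_num = 180` classes, a model that guesses uniformly at random scores ln(180) ≈ 5.19, which is almost exactly where the val_loss plateaus; in other words, the model memorizes the training set but generalizes no better than chance. A quick check:

```python
import math

cls_num = 180  # number of rotation classes used in the script above
uniform_ce = math.log(cls_num)  # cross-entropy of a uniform random guess
print(f"ln({cls_num}) = {uniform_ce:.3f}")  # 5.193, right where val_loss is stuck
```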