DekHub opened this issue 2 years ago (Open)
First of all, thank you for open-sourcing the code. I ran into a problem while running it that I couldn't solve. I am using the dataset downloaded from the link you provided, and I kept all parameters unchanged except the batch size, which I set to 48. After a few epochs all four losses suddenly become NaN. I would really appreciate any pointers or suggestions.

```
[Epoch 40/5001] [Iter 0/1000] [VD loss: 0.657199] [VG loss: 2.012747] [ID loss: 0.650697] [IG loss: 2.070867]
[global_steps 1001/5001] [Iter 1/1001] [VD loss: 0.657033] [VG loss: 2.063967] [ID loss: 0.650411] [IG loss: 2.018135]
[global_steps 1002/5001] [Iter 2/1002] [VD loss: 0.655559] [VG loss: 2.066293] [ID loss: 0.650710] [IG loss: 2.171137]
[global_steps 1003/5001] [Iter 3/1003] [VD loss: 0.657128] [VG loss: 2.062407] [ID loss: 0.650633] [IG loss: 2.049448]
[global_steps 1004/5001] [Iter 4/1004] [VD loss: 0.654847] [VG loss: 2.064400] [ID loss: 0.650288] [IG loss: 2.051192]
[global_steps 1005/5001] [Iter 5/1005] [VD loss: 0.654318] [VG loss: 2.066402] [ID loss: 0.650515] [IG loss: 2.153673]
[global_steps 1006/5001] [Iter 6/1006] [VD loss: 0.654598] [VG loss: 2.061100] [ID loss: 0.650975] [IG loss: 1.974542]
[global_steps 1007/5001] [Iter 7/1007] [VD loss: 0.654075] [VG loss: 2.059378] [ID loss: 0.651421] [IG loss: 2.193369]
[global_steps 1008/5001] [Iter 8/1008] [VD loss: 0.655033] [VG loss: 2.063726] [ID loss: 0.651415] [IG loss: 1.996405]
[global_steps 1009/5001] [Iter 9/1009] [VD loss: 0.653542] [VG loss: 2.068931] [ID loss: 0.651095] [IG loss: 2.157854]
[global_steps 1010/5001] [Iter 10/1010] [VD loss: nan] [VG loss: nan] [ID loss: nan] [IG loss: nan]
[global_steps 1011/5001] [Iter 11/1011] [VD loss: nan] [VG loss: nan] [ID loss: nan] [IG loss: nan]
[global_steps 1012/5001] [Iter 12/1012] [VD loss: nan] [VG loss: nan] [ID loss: nan] [IG loss: nan]
```
@DekHub Hi, sorry for the late reply. Have you tried increasing the batch size?
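For anyone hitting the same NaN losses: besides changing the batch size, it can help to catch the first step where a loss becomes non-finite and to clip gradients before the optimizer update. Below is a minimal sketch assuming a standard PyTorch training loop; `safe_step` and its arguments are hypothetical helpers for illustration, not part of this repository's code.

```python
# Minimal sketch for debugging NaN losses in a PyTorch training loop.
# The model/optimizer below are placeholders, not this repo's actual modules.
import torch

def safe_step(loss, model, optimizer, max_grad_norm=5.0):
    """Skip the update if the loss is non-finite; otherwise clip gradients and step."""
    optimizer.zero_grad(set_to_none=True)
    if not torch.isfinite(loss):
        print(f"Non-finite loss detected ({loss.item()}); skipping this update.")
        return False
    loss.backward()
    # Gradient clipping often keeps a GAN loss from exploding into NaN.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)
    optimizer.step()
    return True

if __name__ == "__main__":
    # Tiny self-contained usage example with a dummy model.
    model = torch.nn.Linear(4, 1)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(8, 4)
    loss = model(x).pow(2).mean()
    safe_step(loss, model, optimizer)

    # To pinpoint the operation that first produces NaN/Inf gradients
    # (slow, enable only while debugging):
    # torch.autograd.set_detect_anomaly(True)
```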