lingtengqiu opened this issue 5 years ago
At the beginning of training it's OK; after several epochs, the loss decreases. @lingtengqiu
I tried it: at the beginning of training the loss is 16, and by the end of the epoch the loss is 4.0.
May I ask how the data is merged? I have tried many methods, but none of them works. Could you describe your procedure in detail? @lingtengqiu
For example, the training total_loss ends up around 4.0. Why does this happen? Do you divide by batch_size? Please help me.
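Whether the reported loss is divided by batch_size changes its scale by a factor of the batch size, which can explain why two people see very different numbers for the "same" loss. A minimal sketch in plain Python (illustrative values, not the repository's actual loss code):

```python
# Sketch: a summed loss vs. a per-sample (mean) loss over one batch.
# mse_sum / mse_mean and the values below are illustrative assumptions,
# not taken from this repository.

def mse_sum(preds, targets):
    # Sum of squared errors over the whole batch; grows with batch_size.
    return sum((p - t) ** 2 for p, t in zip(preds, targets))

def mse_mean(preds, targets):
    # Same loss divided by batch_size; comparable across batch sizes.
    return mse_sum(preds, targets) / len(preds)

preds = [2.0, 3.0, 5.0, 7.0]
targets = [1.0, 3.0, 4.0, 9.0]

print(mse_sum(preds, targets))   # 6.0  (sum over 4 samples)
print(mse_mean(preds, targets))  # 1.5  (sum divided by batch_size = 4)
```

So before comparing loss curves, it is worth checking whether the logged value is summed or averaged over the batch.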