Closed garyliu0816 closed 5 years ago

When using the provided command to train the model, it works fine for the first 10 epochs, but then the loss increases from 3358 to 14401. Is this situation right?
This situation is right. The reason is that the weights of all loss terms are continuously increased (but at different rates and over different intervals). For example, the weight of the cross-reconstruction loss is more than doubled from epoch 20 to epoch 70.
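As a rough illustration of why the reported total can climb, here is a minimal sketch of such a weight schedule, assuming a simple linear ramp; the function name, epoch boundaries, and weight values are all hypothetical and not taken from this repository:

```python
def cross_recon_weight(epoch, start_epoch=20, end_epoch=70,
                       w_start=1.0, w_end=2.5):
    """Linearly ramp the cross-reconstruction loss weight over an
    epoch interval. All numbers here are made up for illustration;
    the repository defines its own schedule."""
    if epoch <= start_epoch:
        return w_start
    if epoch >= end_epoch:
        return w_end
    t = (epoch - start_epoch) / (end_epoch - start_epoch)
    return w_start + t * (w_end - w_start)

# The reported (weighted) loss can then grow across epochs even if the
# unweighted term itself is shrinking:
# total_loss = recon_loss + cross_recon_weight(epoch) * cross_recon_loss + ...
```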
Thank you for your answer. If the loss continuously increases, how can I determine when to stop training?
You can do that via trial and error on the validation set. In my experience, the VAE training is very robust to overfitting, and performance does not get worse with longer training times. For example, it makes a difference whether you train for 40 or 100 epochs (performance will increase), but there is not much difference between 100 and 1000 epochs (overall performance is basically stable). In effect, trying out a few different numbers of epochs should suffice to know when to stop training.
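As a rough sketch of that trial-and-error procedure (the `train_model` and `validation_score` helpers below are hypothetical placeholders for whatever training and evaluation entry points your setup provides):

```python
def train_model(n_epochs):
    """Train (or resume training) for n_epochs and return the model."""
    raise NotImplementedError  # replace with the actual training call

def validation_score(model):
    """Return a validation metric (higher is better)."""
    raise NotImplementedError  # replace with the actual evaluation

candidate_epochs = [40, 100, 300, 1000]
scores = {n: validation_score(train_model(n)) for n in candidate_epochs}

# Since performance plateaus with longer training, take the smallest
# epoch count whose score is within ~1% of the best one observed.
best = max(scores.values())
chosen = min(n for n, s in scores.items() if s >= 0.99 * best)
print(f"scores={scores}; training for {chosen} epochs is enough")
```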
Thanks again, this work is awesome.