sergeywong / cp-vton

Reimplemented code for "Toward Characteristic-Preserving Image-based Virtual Try-On Network"
MIT License
474 stars 182 forks

Train Loss Oscillation #17

Closed — Daisy007girl closed this issue 5 years ago

Daisy007girl commented 5 years ago

When I train the two stages, I find that the loss oscillates slightly from beginning to end, even though it falls overall. I don't think it is related to the learning rate, since I just used the original learning rate from the code and the loss still keeps falling in the overall trend. Why is the loss always jumping?

solitarysandman commented 5 years ago

Because you're training in mini-batches. Each batch is only a small random sample of the dataset, so the loss computed on it is a noisy estimate of the full-dataset loss, and it jumps from step to step even while the overall trend goes down. Please look up how mini-batch gradient descent works: https://engmrk.com/mini-batch-gd/
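To illustrate (this is a toy sketch, not code from cp-vton): the raw per-batch loss can be modeled as a downward trend plus sampling noise, and a smoothing pass such as an exponential moving average recovers the trend. The `batch_losses` and `ema` helpers below are hypothetical names for this illustration.

```python
import random

random.seed(0)

def batch_losses(steps=1000):
    # Simulated per-batch losses: a decaying trend plus uniform noise,
    # mimicking the noisy estimates you see when training in mini-batches.
    return [1.0 / (1 + 0.01 * t) + random.uniform(-0.05, 0.05)
            for t in range(steps)]

def ema(values, beta=0.98):
    # Exponential moving average, the usual way loss loggers smooth curves.
    smoothed, avg = [], values[0]
    for v in values:
        avg = beta * avg + (1 - beta) * v
        smoothed.append(avg)
    return smoothed

losses = batch_losses()
smooth = ema(losses)

# The raw curve jumps batch to batch; the smoothed curve falls steadily.
print(f"raw first/last: {losses[0]:.3f} / {losses[-1]:.3f}")
print(f"ema first/last: {smooth[0]:.3f} / {smooth[-1]:.3f}")
```

If you plot the smoothed curve (or use TensorBoard's smoothing slider) and it still trends down, the oscillation is expected behavior, not a bug.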