nie-lang / UnsupervisedDeepImageStitching

TIP2021 - Unsupervised deep image stitching network

The loss suddenly becomes larger #27

Closed csm-coder closed 2 years ago

csm-coder commented 2 years ago

Hello, thanks for your code. When I train the H_model, the loss always suddenly jumps to a much larger value and then cannot be lowered. Can you give me some advice? Here are some of my loss curves: 1 2

nie-lang commented 2 years ago

Well, I did not meet this problem when training on my dataset. (I guess your dataset contains abundant low-overlap image pairs, which can make this unsupervised method hard to train.)

Maybe you can pre-train the model on a synthetic dataset first, and then fine-tune it on your own dataset.

csm-coder commented 2 years ago

Actually, I train it on the COCO dataset. When I set the batch size to 2, it doesn't have this problem, but if I change the loss weights, the problem appears again. Have you tried other loss weights or learning rates? It seems the model is a little fragile.

nie-lang commented 2 years ago

In fact, I have tried different weights for the supervised version of this deep homography model. I found the loss weights should satisfy w_{loss1} > w_{loss2} > w_{loss3} (see line 59 of https://github.com/nie-lang/UnsupervisedDeepImageStitching/blob/main/ImageAlignment/Codes/train_H.py).

For example, I set w_{loss1} = w_{loss2} = w_{loss3} for the supervised version, but it seems hard to train, because the lower pyramid level should be given a larger weight since its result directly affects the higher pyramid levels. For the unsupervised version, the rule is the same.
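To make the decreasing-weight rule concrete, here is a minimal sketch (not the repository's exact code): `loss_1` denotes the loss from the lowest (coarsest) pyramid level, and the weight values 16/4/1 are illustrative assumptions, not the values used in train_H.py.

```python
# Minimal sketch of the pyramid loss weighting rule discussed above.
# loss_1 is the lowest (coarsest) pyramid level; it gets the largest
# weight because its estimate conditions the finer levels directly.
# The weights 16.0/4.0/1.0 are illustrative assumptions, not the
# values used in the repository's train_H.py.
def total_pyramid_loss(loss_1, loss_2, loss_3, w1=16.0, w2=4.0, w3=1.0):
    assert w1 > w2 > w3, "coarser pyramid levels should carry larger weights"
    return w1 * loss_1 + w2 * loss_2 + w3 * loss_3

# Example: equal per-level losses; the weighted sum is dominated by the
# coarsest level, matching the w_{loss1} > w_{loss2} > w_{loss3} rule.
print(total_pyramid_loss(1.0, 1.0, 1.0))  # 21.0
```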

csm-coder commented 2 years ago

Thanks a lot for your reply. In fact, I was just trying to set w_{loss1} = w_{loss2} = w_{loss3}, and it is truly hard to train, as you said.