TomTomTommi / HiNet

Official PyTorch implementation of "HiNet: Deep Image Hiding by Invertible Network" (ICCV 2021)

Something about loss function. #11

Open kerryhhh opened 2 years ago

kerryhhh commented 2 years ago

Thank you for sharing your code! However, I noticed that the loss-function hyperparameters (`lamda_reconstruction` and `lamda_low_frequency`) in your code differ from those in the paper. Which ones should I use?

TomTomTommi commented 2 years ago

Hi, thanks for your interest. The hyperparameters depend on the trade-off between concealing quality and recovery quality. In the first stage, all hyperparameters can be set to 1 until the network converges. Then you can finetune the network with different lambda values according to the performance you want. For example, if you prefer higher reconstruction quality, set `lamda_reconstruction` higher.
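The staged weighting described above can be sketched as a plain weighted sum. The names `lamda_reconstruction` and `lamda_low_frequency` follow the repository's config; `lamda_guide` and all concrete loss values below are illustrative placeholders, not taken from the HiNet code:

```python
# Sketch of a weighted total loss, assuming three terms as in HiNet-style
# training. lamda_guide and the numeric loss values are illustrative only.
def total_loss(rec_loss, guide_loss, freq_loss,
               lamda_reconstruction=1.0, lamda_guide=1.0,
               lamda_low_frequency=1.0):
    """Weighted sum of the three loss terms."""
    return (lamda_reconstruction * rec_loss
            + lamda_guide * guide_loss
            + lamda_low_frequency * freq_loss)

# Stage 1: all weights set to 1 until the network converges.
stage1 = total_loss(0.5, 0.3, 0.2)
# Stage 2: finetune, e.g. emphasise reconstruction quality.
stage2 = total_loss(0.5, 0.3, 0.2, lamda_reconstruction=2.0)
```

Raising one lambda only shifts the optimisation pressure toward that term, so the other qualities may degrade slightly in exchange.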

fkeufss commented 2 years ago

Thank you for sharing your code. I am trying it and I also run into the loss explosion problem. Do you know the underlying reason for it? Is there a better solution than manually restarting training with a lower learning rate every time?
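One common workaround (a generic sketch, not something from the authors or the HiNet code) is to detect the explosion automatically and roll back to the last good checkpoint with a reduced learning rate, instead of restarting by hand. The threshold, decay factor, and the `state` dictionary below are all illustrative assumptions:

```python
import math

# Generic rollback-on-explosion sketch. The explosion threshold,
# lr decay factor, and state layout are illustrative assumptions.
def training_step(state, loss, explode_factor=10.0, lr_decay=0.5):
    """Record progress, or roll back and lower the lr on explosion."""
    best = state.get("best_loss", float("inf"))
    if math.isnan(loss) or loss > explode_factor * best:
        # Explosion detected: restore the last checkpoint, halve the lr.
        state["weights"] = state["checkpoint"]
        state["lr"] *= lr_decay
        return state, True  # signal that a rollback happened
    # Normal step: track the best loss and checkpoint the weights.
    state["best_loss"] = min(best, loss)
    state["checkpoint"] = state["weights"]
    return state, False
```

Gradient-norm clipping each step (e.g. `torch.nn.utils.clip_grad_norm_` in PyTorch) is another standard way to reduce the chance of the explosion in the first place.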