Closed · huang-hz closed this issue 1 year ago
Nice work! I can run the test on Rain200L successfully, but it reports an error when loading the Rain100L model. Was the attention module removed when training on Rain100L?
RuntimeError: Error(s) in loading state_dict for DRSformer: Missing key(s) in state_dict: "encoder_level1.0.attn.attn1", "encoder_level1.0.attn.attn2", "encoder_level1.0.attn.attn3", "encoder_level1.0.attn.attn4", "encoder_level1.1.attn.attn1", "encoder_level1.1.attn.attn2", "encoder_level1.1.attn.attn3", "encoder_level1.1.attn.attn4", "encoder_level1.2.attn.attn1", "encoder_level1.2.attn.attn2", "encoder_level1.2.attn.attn3", "encoder_level1.2.attn.attn4", "encoder_level1.3.attn.attn1", "encoder_level1.3.attn.attn2", "encoder_level1.3.attn.attn3", "encoder_level1.3.attn.attn4", "encoder_level2.0.attn.attn1", "encoder_level2.0.attn.attn2", "encoder_level2.0.attn.attn3", "encoder_level2.0.attn.attn4", "encoder_level2.1.attn.attn1", "encoder_level2.1.attn.attn2", "encoder_level2.1.attn.attn3", "encoder_level2.1.attn.attn4", "encoder_level2.2.attn.attn1", "encoder_level2.2.attn.attn2", "encoder_level2.2.attn.attn3", "encoder_level2.2.attn.attn4", "encoder_level2.3.attn.attn1", "encoder_level2.3.attn.attn2", "encoder_level2.3.attn.attn3", "encoder_level2.3.attn.attn4", "encoder_level2.4.attn.attn1", "encoder_level2.4.attn.attn2", "encoder_level2.4.attn.attn3", "encoder_level2.4.attn.attn4", "encoder_level2.5.attn.attn1", "encoder_level2.5.attn.attn2", "encoder_level2.5.attn.attn3", "encoder_level2.5.attn.attn4", "encoder_level3.0.attn.attn1", "encoder_level3.0.attn.attn2", "encoder_level3.0.attn.attn3", "encoder_level3.0.attn.attn4", "encoder_level3.1.attn.attn1", "encoder_level3.1.attn.attn2", "encoder_level3.1.attn.attn3", "encoder_level3.1.attn.attn4", "encoder_level3.2.attn.attn1", "encoder_level3.2.attn.attn2", "encoder_level3.2.attn.attn3", "encoder_level3.2.attn.attn4", "encoder_level3.3.attn.attn1", "encoder_level3.3.attn.attn2", "encoder_level3.3.attn.attn3", "encoder_level3.3.attn.attn4", "encoder_level3.4.attn.attn1", "encoder_level3.4.attn.attn2", "encoder_level3.4.attn.attn3", "encoder_level3.4.attn.attn4", "encoder_level3.5.attn.attn1", 
"encoder_level3.5.attn.attn2", "encoder_level3.5.attn.attn3", "encoder_level3.5.attn.attn4", "latent.0.attn.attn1", "latent.0.attn.attn2", "latent.0.attn.attn3", "latent.0.attn.attn4", "latent.1.attn.attn1", "latent.1.attn.attn2", "latent.1.attn.attn3", "latent.1.attn.attn4", "latent.2.attn.attn1", "latent.2.attn.attn2", "latent.2.attn.attn3", "latent.2.attn.attn4", "latent.3.attn.attn1", "latent.3.attn.attn2", "latent.3.attn.attn3", "latent.3.attn.attn4", "latent.4.attn.attn1", "latent.4.attn.attn2", "latent.4.attn.attn3", "latent.4.attn.attn4", "latent.5.attn.attn1", "latent.5.attn.attn2", "latent.5.attn.attn3", "latent.5.attn.attn4", "latent.6.attn.attn1", "latent.6.attn.attn2", "latent.6.attn.attn3", "latent.6.attn.attn4", "latent.7.attn.attn1", "latent.7.attn.attn2", "latent.7.attn.attn3", "latent.7.attn.attn4", "decoder_level3.0.attn.attn1", "decoder_level3.0.attn.attn2", "decoder_level3.0.attn.attn3", "decoder_level3.0.attn.attn4", "decoder_level3.1.attn.attn1", "decoder_level3.1.attn.attn2", "decoder_level3.1.attn.attn3", "decoder_level3.1.attn.attn4", "decoder_level3.2.attn.attn1", "decoder_level3.2.attn.attn2", "decoder_level3.2.attn.attn3", "decoder_level3.2.attn.attn4", "decoder_level3.3.attn.attn1", "decoder_level3.3.attn.attn2", "decoder_level3.3.attn.attn3", "decoder_level3.3.attn.attn4", "decoder_level3.4.attn.attn1", "decoder_level3.4.attn.attn2", "decoder_level3.4.attn.attn3", "decoder_level3.4.attn.attn4", "decoder_level3.5.attn.attn1", "decoder_level3.5.attn.attn2", "decoder_level3.5.attn.attn3", "decoder_level3.5.attn.attn4", "decoder_level2.0.attn.attn1", "decoder_level2.0.attn.attn2", "decoder_level2.0.attn.attn3", "decoder_level2.0.attn.attn4", "decoder_level2.1.attn.attn1", "decoder_level2.1.attn.attn2", "decoder_level2.1.attn.attn3", "decoder_level2.1.attn.attn4", "decoder_level2.2.attn.attn1", "decoder_level2.2.attn.attn2", "decoder_level2.2.attn.attn3", "decoder_level2.2.attn.attn4", "decoder_level2.3.attn.attn1", 
"decoder_level2.3.attn.attn2", "decoder_level2.3.attn.attn3", "decoder_level2.3.attn.attn4", "decoder_level2.4.attn.attn1", "decoder_level2.4.attn.attn2", "decoder_level2.4.attn.attn3", "decoder_level2.4.attn.attn4", "decoder_level2.5.attn.attn1", "decoder_level2.5.attn.attn2", "decoder_level2.5.attn.attn3", "decoder_level2.5.attn.attn4", "decoder_level1.0.attn.attn1", "decoder_level1.0.attn.attn2", "decoder_level1.0.attn.attn3", "decoder_level1.0.attn.attn4", "decoder_level1.1.attn.attn1", "decoder_level1.1.attn.attn2", "decoder_level1.1.attn.attn3", "decoder_level1.1.attn.attn4", "decoder_level1.2.attn.attn1", "decoder_level1.2.attn.attn2", "decoder_level1.2.attn.attn3", "decoder_level1.2.attn.attn4", "decoder_level1.3.attn.attn1", "decoder_level1.3.attn.attn2", "decoder_level1.3.attn.attn3", "decoder_level1.3.attn.attn4".
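For anyone debugging a similar failure: the "Missing key(s)" error means the instantiated model defines parameters (here the `attn.attn1`–`attn.attn4` branches in every block) that the checkpoint file simply does not contain. A quick way to see the mismatch before calling `load_state_dict` is to diff the two key sets. The helper below is a minimal sketch that works on any two key iterables; the commented PyTorch call shows how it would typically be wired up (the checkpoint path is hypothetical).

```python
def diff_state_dicts(model_keys, ckpt_keys):
    """Return (missing, unexpected): keys the model expects but the
    checkpoint lacks, and keys the checkpoint has but the model does not."""
    model_keys, ckpt_keys = set(model_keys), set(ckpt_keys)
    missing = sorted(model_keys - ckpt_keys)
    unexpected = sorted(ckpt_keys - model_keys)
    return missing, unexpected


# With PyTorch this would typically be (path is a placeholder):
#   ckpt = torch.load("pretrained/rain100l.pth", map_location="cpu")
#   missing, unexpected = diff_state_dicts(model.state_dict(), ckpt)
missing, unexpected = diff_state_dicts(
    ["encoder_level1.0.attn.attn1", "encoder_level1.0.attn.qkv.weight"],
    ["encoder_level1.0.attn.qkv.weight"],
)
# `missing` now lists the attention branch absent from the checkpoint
```

Every name reported as missing here is exactly what `load_state_dict` would raise a `RuntimeError` about, so this confirms whether the checkpoint was trained with a slimmer architecture.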
Thank you for discovering the error. This may be due to an incorrect pre-trained model that we uploaded. We suggest you retrain DRSformer on the Rain100L dataset. We apologize for this mistake and will correct it as soon as possible.
In the meantime, I used the Rain200L model to test on Rain100L, and got almost the same PSNR and SSIM as recorded in More Results:
MATLAB: Rain100L dataset, PSNR 42.446037, SSIM 0.990438
scikit-learn: Rain100L dataset, average PSNR 41.158583161145195, average SSIM 0.989074344786784
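A gap like the one above (42.45 dB in MATLAB vs. 41.16 dB in Python) is common and does not necessarily indicate a bug: deraining papers typically follow the MATLAB convention of computing PSNR/SSIM on the luminance (Y) channel after RGB→YCbCr conversion, while Python pipelines often compute them on full RGB, and rounding/data-type handling also differs. Whatever the pipeline, PSNR itself is just `10·log10(MAX² / MSE)`; here is a minimal pure-Python sketch (flat pixel sequences, 8-bit range assumed) to sanity-check a metric implementation.

```python
import math


def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)
```

Comparing this against both the MATLAB script and the Python evaluation on the same channel (Y vs. RGB) usually explains most of the discrepancy.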
We will recheck and update More Results. Thank you very much for pointing out this issue.
Thank you for your work. Why does the PSNR on the validation set remain constant during training? Is it because the validation set is not evaluated in real time with the model being trained? So I still need to use the trained model at the end to test the images and compute PSNR, right?
Note that we do not use MEFC for training Rain200L/100L and SPA-Data. If you need to retrain, please modify the file DRSformer_arch.py.
Do I understand correctly that if I train on Rain200H, the validation-set PSNR will change during training?
Yes.
Why did I get similar errors on both the Rain200L and Rain100L test sets with the Rain200L pre-trained model? Is it a problem with the provided pre-trained model?
Please modify the file DRSformer_arch.py. We do not use MEFC for training Rain200L.
I see what you mean. However, I used the pre-trained models you provided, so the problem is only with the Rain200 pre-trained model, and the other pre-trained models are fine for now, right? I haven't tested the other models yet; do I need to change DRSformer_arch.py for testing them as well?
The pre-trained models we provide have no problems. If you want to test Rain200L, you need to modify the model file before testing.
So when testing, do I have to comment out (with '#') the code you annotated?
We have now uploaded this file: DRSformer_arch_200L+SPA.py.
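The whole thread boils down to one mechanism: if a submodule is only constructed under a flag (as with the MEFC-related branches here), the parameter names it registers exist in one variant of the model and not the other, so checkpoints from the two variants are not interchangeable. The toy stand-in below is not the real DRSformer code; the class and flag names are invented purely to illustrate how a constructor flag determines which `state_dict` keys exist.

```python
class ToyAttention:
    """Toy stand-in (not real DRSformer code): a constructor flag decides
    which parameter names appear in the state_dict."""

    def __init__(self, use_extra_branches=True):
        self._params = {"qkv.weight": 0.0}  # always present
        if use_extra_branches:
            # hypothetical learnable scalars, mirroring attn1..attn4
            for i in range(1, 5):
                self._params[f"attn{i}"] = 1.0

    def state_dict(self):
        return dict(self._params)


full = ToyAttention(use_extra_branches=True).state_dict()
slim = ToyAttention(use_extra_branches=False).state_dict()
# Keys in `full` but not in `slim` are exactly what PyTorch would report
# as "Missing key(s)" when loading a slim checkpoint into the full model.
```

This is why the fix is to build the model with the same configuration the checkpoint was trained with (here, the uploaded DRSformer_arch_200L+SPA.py) rather than to patch the checkpoint.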