Ephemeral182 / UDR-S2Former_deraining

[ICCV'23] Sparse Sampling Transformer with Uncertainty-Driven Ranking for Unified Removal of Raindrops and Rain Streaks
https://ephemeral182.github.io/UDR_S2Former_deraining/

Regarding the issue with testing #5

Closed Lecxxx closed 6 months ago

Lecxxx commented 6 months ago

Hello, thank you for your excellent work. However, there is an issue with the provided code that has been troubling me: the saved checkpoint cannot be loaded by the test script.

I would be grateful if you could provide me with the relevant assistance.

Ephemeral182 commented 6 months ago

Hi, we use PyTorch Lightning for training, so checkpoints are saved together with other hyperparameters in a `*.ckpt` file. The underlying pytorch_lightning machinery makes this a bit difficult for some people, so you first need to extract the model's state dict from the `.ckpt` file (to match my `.pth` checkpoints). Here is the code for this operation:

```python
from collections import OrderedDict

import torch

# Load the Lightning checkpoint (replace the path with your own .ckpt file).
ckpt = torch.load('path/to/your_checkpoint.ckpt', map_location='cpu')

new_state_dict = OrderedDict()
for k, v in ckpt['state_dict'].items():
    # Keep only the model weights, dropping the 'model.' prefix
    # that pytorch_lightning prepends to every key.
    if not k.startswith('model.'):
        continue
    new_state_dict[k[len('model.'):]] = v

# Load the cleaned state dict into your model instance.
your_model.load_state_dict(new_state_dict, strict=False)
```
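The prefix-stripping step above can be sanity-checked without a real checkpoint. Below is an illustrative helper (the function name is my own, not part of the repo) applied to a toy state dict:

```python
from collections import OrderedDict


def strip_prefix(state_dict, prefix="model."):
    """Keep only keys starting with `prefix` and drop the prefix itself."""
    out = OrderedDict()
    for k, v in state_dict.items():
        if k.startswith(prefix):
            out[k[len(prefix):]] = v
    return out


# Toy stand-in for ckpt['state_dict']: two model weights plus an
# unrelated entry that should be filtered out.
toy = OrderedDict([
    ("model.conv.weight", 1),
    ("model.conv.bias", 2),
    ("epoch", 10),
])
print(strip_prefix(toy))  # OrderedDict([('conv.weight', 1), ('conv.bias', 2)])
```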
Owen718 commented 6 months ago

> The code you provided is too poor. The saved model cannot be loaded in the test, and the original training code is also very complicated and unclear. Please provide well-improved, readable code.

Additionally, in PyTorch Lightning the trainer supports direct validation via the `trainer.validate` function. I recommend that you first consult the PyTorch Lightning documentation for more details.

Lecxxx commented 6 months ago

@Owen718 Hello, I have already solved the problem with my own workaround. Since the training code can load `.ckpt` files, I forcibly added `--trainer.num_sanity_val_steps 500` to the training command so that the sanity-validation pass completes the testing.

However, the fact remains that the test code provided by the author cannot directly use the `.ckpt` files saved during training. I simply raised my issue and expressed doubts about the code's explanation. Of course, all the learning files in my repository are open for review, and I would not mind you knowing my real identity.

Lecxxx commented 6 months ago

@Owen718 Additionally, if you feel there was any impoliteness in my question, I am willing to revise it to make you comfortable. However, the issues with the code are an undeniable fact. You cannot assume that readers are facing problems merely due to unfamiliarity with PyTorch Lightning.

Owen718 commented 6 months ago

> @Owen718 Additionally, if you feel there was any impoliteness in my question, I am willing to revise it to make you comfortable. However, the issues with the code are an undeniable fact. You cannot assume that readers are facing problems merely due to unfamiliarity with PyTorch Lightning.

You're right, we can't guarantee that everyone who uses our code has the same level of expertise as we do.

Sharing academic work in open source is meant to increase its impact and benefit the community, and it certainly does not have the robustness of production-level code. I hope that your team can make as much of your work open source as possible in the future. Thank you.

Lecxxx commented 6 months ago

@Owen718 Even so, your response still does not acknowledge the existence of issues with the testing part of the code, which leaves me confused. Furthermore, your reply does not seem as polite and respectful as you claimed earlier. Bringing up topics unrelated to this work and discussion only causes discomfort. Since I have already resolved my issue, I see no need for further discussion. I wish you success in your future work.

Owen718 commented 6 months ago
[screenshot]

I don’t need to show respect to someone who speaks to strangers in such a tone.