ColinTaoZhang / SANet


Which implementation of `CharbonnierLoss` was used in SANet? #1

Open sttzr opened 1 year ago

sttzr commented 1 year ago

Hi! As part of my bachelor thesis, I am studying your research and trying to understand the source code. I managed to install almost all of the required dependencies:

pip install rawpy exifread natsort git+https://github.com/ildoonet-gradual-warmup-lr.git

But this one is still missing:

https://github.com/ColinTaoZhang/SANet/blob/7778df59c8878a64765fe91b9adacab43a738241/train_denoising.py#L27

Is this a local reference to your own implementation that is not included in this repo? Or can it be installed via pip? And why did you use the Charbonnier loss instead of a plain L1 loss?

Thanks a lot for your great work and your help!

sttzr commented 1 year ago

Did you use this one? https://github.com/victorca25/traiNNer/blob/1d25dcdb4c6a3a1f63589bb4f724fa4857009f31/codes/models/modules/loss.py#L47-L57

ColinTaoZhang commented 1 year ago

> Did you use this one? https://github.com/victorca25/traiNNer/blob/1d25dcdb4c6a3a1f63589bb4f724fa4857009f31/codes/models/modules/loss.py#L47-L57

Yes, that linked Charbonnier loss implementation is the right one.

Actually, we used L1 loss rather than Charbonnier loss in our experiments, and we have not compared the performance of the two losses.
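For anyone else landing here: the Charbonnier loss is a smooth approximation of L1, `sqrt((x - y)^2 + eps^2)`, which is differentiable at zero unlike plain L1. Below is a minimal PyTorch sketch following the same formulation as the linked traiNNer code; the class name and the `eps` default are assumptions for illustration, not taken from SANet itself:

```python
import torch
import torch.nn as nn


class CharbonnierLoss(nn.Module):
    """Charbonnier loss: a differentiable variant of L1 loss.

    L(x, y) = mean( sqrt((x - y)^2 + eps^2) )

    For small eps it behaves like L1, but has a smooth gradient
    near zero error. eps=1e-6 is an assumed default here.
    """

    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.eps = eps

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        diff = pred - target
        return torch.mean(torch.sqrt(diff * diff + self.eps * self.eps))


# With a large error, Charbonnier is nearly identical to L1:
pred = torch.zeros(2, 3)
target = torch.ones(2, 3)
charbonnier = CharbonnierLoss()(pred, target)
l1 = nn.L1Loss()(pred, target)
```

Dropping this class into the training script (in place of the missing import) should reproduce the setup either way, since the authors report using L1 in their experiments.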