TrystAI / restorers

Restorers provide out-of-the-box TensorFlow implementations of SoTA image and video restoration models for tasks such as low-light enhancement, denoising, deblurring, super-resolution, etc.
https://wandb.me/low-light

Training MirNetv2 and NAFNet in a Knowledge Distillation setup #48

Open Samarendra109 opened 1 year ago

Samarendra109 commented 1 year ago

In the original NAFNet paper (https://arxiv.org/abs/2204.04676), the authors find the optimal number of blocks to be 36: it gives a large gain in performance for only a minimal increase in latency, so we need to train NAFNet with that setting. In initial tests, NAFNet beats MirNetv2 on the LOL dataset even with just 9 blocks. There is therefore reason to believe that a 9-block NAFNet distilled from a 36-block teacher can deliver excellent performance. This needs to be tested.
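A minimal sketch of what such a setup could look like in TensorFlow, assuming output-level distillation with a frozen teacher. The `DistillationTrainer` class, the Charbonnier reconstruction loss, and the `alpha` weighting are illustrative assumptions, not the repository's actual API:

```python
import tensorflow as tf

# Hypothetical sketch: a frozen 36-block teacher guides a 9-block
# student. The loss choice and weighting are illustrative, not the
# repo's actual training code.

def charbonnier_loss(y_true, y_pred, eps=1e-3):
    # Smooth L1-style reconstruction loss common in image restoration.
    return tf.reduce_mean(tf.sqrt(tf.square(y_true - y_pred) + eps ** 2))


class DistillationTrainer(tf.keras.Model):
    def __init__(self, teacher, student, alpha=0.5):
        super().__init__()
        self.teacher = teacher
        self.teacher.trainable = False  # the teacher stays frozen
        self.student = student
        self.alpha = alpha  # balance between ground-truth and teacher terms

    def call(self, inputs, training=False):
        return self.student(inputs, training=training)

    def train_step(self, data):
        low_light, ground_truth = data
        # Teacher predictions act as soft targets for the student.
        teacher_output = self.teacher(low_light, training=False)
        with tf.GradientTape() as tape:
            student_output = self.student(low_light, training=True)
            supervised = charbonnier_loss(ground_truth, student_output)
            distill = charbonnier_loss(teacher_output, student_output)
            loss = self.alpha * supervised + (1.0 - self.alpha) * distill
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(
            zip(grads, self.student.trainable_variables)
        )
        return {"loss": loss, "supervised": supervised, "distill": distill}
```

With something like this, a `DistillationTrainer(teacher_36_block, student_9_block)` compiled with a standard optimizer could be `fit` on paired LOL data. Whether distilling only on outputs suffices, or intermediate feature matching is also needed, is part of what the experiment should determine.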

soumik12345 commented 1 year ago

TODOs: