hongyuanyu / SPAN

Swift Parameter-free Attention Network for Efficient Super-Resolution
Apache License 2.0

Re-parametrization and torch warnings #4

Open neosr-project opened 11 months ago

neosr-project commented 11 months ago

Hi! Thanks for your work. I've ported the network to neosr; however, the re-parametrization triggers torch warnings, as discussed in issue 16:

2023-12-11 03:47:37,177 WARNING: Params conv_1.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params conv_1.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_1.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,177 WARNING: Params block_2.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_2.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_2.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_2.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_2.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_2.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_3.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_4.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,178 WARNING: Params block_4.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_4.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_4.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_4.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_4.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_5.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_6.c1_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_6.c1_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_6.c2_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,179 WARNING: Params block_6.c2_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,180 WARNING: Params block_6.c3_r.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,180 WARNING: Params block_6.c3_r.eval_conv.bias will not be optimized.
2023-12-11 03:47:37,180 WARNING: Params conv_2.eval_conv.weight will not be optimized.
2023-12-11 03:47:37,180 WARNING: Params conv_2.eval_conv.bias will not be optimized.
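
For context, neosr's trainer derives from BasicSR, and the warning appears to come from its optimizer setup, which skips any parameter with requires_grad=False. A simplified sketch of that pattern (not the exact neosr code):

import logging

logger = logging.getLogger(__name__)

def collect_optim_params(net):
    # BasicSR-style selection: only parameters with requires_grad=True
    # are handed to the optimizer; every frozen tensor produces one
    # "will not be optimized" warning
    optim_params = []
    for name, param in net.named_parameters():
        if param.requires_grad:
            optim_params.append(param)
        else:
            logger.warning(f'Params {name} will not be optimized.')
    return optim_params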

Is there any way to prevent it? Setting self.training to True doesn't seem to solve the issue; it looks like eval_conv is still being created with requires_grad=False, so the optimizer setup keeps flagging it. Thanks in advance.

neosr-project commented 11 months ago

Gating the requires_grad freeze and the self.update_params() call behind a condition solved the warnings:

        # note: this reuses nn.Module's built-in `training` flag
        self.training = training

        # freeze and fuse only outside of training, so eval_conv's params
        # keep requires_grad=True during training and never hit the
        # "will not be optimized" branch
        if not self.training:
            self.eval_conv.weight.requires_grad = False
            self.eval_conv.bias.requires_grad = False
            self.update_params()

Here self.training is a bool whose value comes from the options file:

from pathlib import Path
from neosr.utils.options import parse_options

# initialize options parsing
root_path = Path(__file__).parents[2]
opt, args = parse_options(root_path, is_train=True)
# set variable for training mode: a 'train' dataset present means training
training = 'train' in opt['datasets']
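
For reference, a minimal self-contained sketch of the whole pattern with the conditional freeze applied. RepConv and its single 3x3 + 1x1 branch pair are hypothetical simplifications for illustration; SPAN's Conv3XC fuses more branches:

import torch
import torch.nn as nn
import torch.nn.functional as F

class RepConv(nn.Module):
    """Re-parameterizable conv: 3x3 + 1x1 branches fused into eval_conv."""

    def __init__(self, channels, deploy=False):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv1 = nn.Conv2d(channels, channels, 1)
        self.eval_conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.deploy = deploy
        if deploy:
            # inference only: freeze the fused conv so BasicSR-style
            # optimizer setup skips it, then fold the branches into it
            self.eval_conv.weight.requires_grad = False
            self.eval_conv.bias.requires_grad = False
            self.update_params()

    @torch.no_grad()
    def update_params(self):
        # a 1x1 kernel zero-padded to 3x3 is equivalent under padding=1,
        # so both branches sum into a single 3x3 kernel and bias
        w = self.conv3.weight + F.pad(self.conv1.weight, [1, 1, 1, 1])
        b = self.conv3.bias + self.conv1.bias
        self.eval_conv.weight.copy_(w)
        self.eval_conv.bias.copy_(b)

    def forward(self, x):
        if self.deploy:
            return self.eval_conv(x)
        return self.conv3(x) + self.conv1(x)

# sanity check: the fused path matches the training-time branches
m = RepConv(8)
m.update_params()
x = torch.randn(1, 8, 16, 16)
y_branches = m(x)
m.deploy = True
assert torch.allclose(y_branches, m(x), atol=1e-5)

During training (deploy=False) eval_conv keeps requires_grad=True, so the optimizer-setup loop accepts it without warning; it simply never receives gradients because it is not on the training forward path. At inference it is frozen and fused once.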