FcaNet: Frequency Channel Attention Networks

dct_h and dct_w #37


myasser63 commented 2 years ago

How can I set dct_h and dct_w if I want to add an FCA layer into another model? The feature maps for the layers where I want to insert the FCA layer are 160x160, 80x80, 40x40, and 20x20.

Please advise.

cfzd commented 2 years ago

@myasser63 You can add the FCA layer directly, without any modification. The feature map's size is handled automatically here: https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/layer.py#L54-L55
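For context, a minimal sketch of what the linked lines do (the helper name pool_to_dct_size is mine, not the repo's): if the incoming feature map does not match (dct_h, dct_w), it is adaptively average-pooled down before the fixed-size DCT weights are applied, so any input resolution works.

import torch
import torch.nn.functional as F

def pool_to_dct_size(x: torch.Tensor, dct_h: int, dct_w: int) -> torch.Tensor:
    # Mirrors the linked layer.py lines: pool the feature map down to the
    # DCT filter size whenever the spatial dims do not already match it.
    _, _, h, w = x.shape
    if h != dct_h or w != dct_w:
        x = F.adaptive_avg_pool2d(x, (dct_h, dct_w))
    return x

# e.g. a 160x160 feature map is pooled to a layer's 56x56 DCT size:
y = pool_to_dct_size(torch.randn(1, 64, 160, 160), 56, 56)
print(y.shape)  # torch.Size([1, 64, 56, 56])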

myasser63 commented 2 years ago

So should I leave dct_h and dct_w as they are, or set them to the feature map sizes?

self.FCA = MultiSpectralAttentionLayer(in_channels, self.dct_h, self.dct_w)

cfzd commented 2 years ago

Whatever you want. You can set them according to your preference, or use the same settings as ours: https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/fcanet.py#L19 https://github.com/cfzd/FcaNet/blob/aa5fb63505575bb4e4e094613565379c3f6ada33/model/fcanet.py#L29
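For the feature-map sizes from the question, a hedged sketch in the style of the c2wh dict used in the linked fcanet.py (the channel counts 64/128/256/512 and the import path are placeholder assumptions; substitute your own):

from model.layer import MultiSpectralAttentionLayer  # assumed repo layout

# Hypothetical channels -> feature-map-size mapping for the 160/80/40/20
# stages, analogous to the c2wh dict in the linked fcanet.py.
c2wh = {64: 160, 128: 80, 256: 40, 512: 20}

def make_fca(in_channels):
    # dct_h and dct_w set to that stage's feature-map size
    return MultiSpectralAttentionLayer(in_channels, c2wh[in_channels], c2wh[in_channels])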

myasser63 commented 2 years ago

I tried it this way and got this error:

  self.FCA = MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1])

Error: RuntimeError: adaptive_avg_pool2d_backward_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True)'. You can turn off determinism just for this operation, or you can use the 'warn_only=True' option, if that's acceptable for your application. You can also file an issue at https://github.com/pytorch/pytorch/issues

cfzd commented 2 years ago

@myasser63 As the error says, it's a problem with adaptive_avg_pool2d. You can either downgrade the error to a warning with:

torch.use_deterministic_algorithms(True, warn_only=True)

or turn off determinism entirely with:

torch.use_deterministic_algorithms(False)
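Either way, the call goes once near the top of the training script, before any forward or backward pass; e.g. for the warn-only variant:

import torch

# Keep determinism where PyTorch supports it, but only warn (instead of
# raising) for ops like adaptive_avg_pool2d_backward that have no
# deterministic CUDA implementation.
torch.use_deterministic_algorithms(True, warn_only=True)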