NVlabs / FAN

Official PyTorch implementation of Fully Attentional Networks
https://arxiv.org/abs/2204.12451

fan_small_12_p16_224_se_attn Forward Issue #13

Open kiansierra opened 2 years ago

kiansierra commented 2 years ago

Hi there, I receive the following error when calling the forward method on models that use FANBlock_SE:

TypeError: forward() missing 2 required positional arguments: 'H' and 'W'

An example of code that reproduces the issue

from FAN.models.fan import fan_small_12_p16_224_se_attn
import torch

model = fan_small_12_p16_224_se_attn()
model(torch.ones(2,3,224,224))
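For anyone unfamiliar with this class of error: it typically means the model's internal loop calls each block as block(x), while the SE block's forward signature also expects the spatial dimensions H and W. A minimal, torch-free sketch of that call pattern (the class and function names here are hypothetical, not the FAN internals):

```python
# Hypothetical classes illustrating the signature mismatch behind the
# reported TypeError; these are NOT the actual FAN implementations.

class PlainBlock:
    def forward(self, x):
        # Accepts only the token tensor.
        return x


class SEBlock:
    def forward(self, x, H, W):
        # Like FANBlock_SE: also requires the spatial dims H and W.
        return x


def run_blocks(blocks, x):
    # Mimics a forward loop that passes only the tokens to every block.
    for block in blocks:
        x = block.forward(x)
    return x


try:
    run_blocks([PlainBlock(), SEBlock()], "tokens")
except TypeError as err:
    # Message ends with: missing 2 required positional arguments: 'H' and 'W'
    print(err)
```

The fix is for the caller to pass H and W through to blocks whose forward requires them, which is presumably what the linked PR does.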


Carl1998-CY commented 1 year ago

Is your problem solved? I'm having the same issue too

kiansierra commented 1 year ago

No, but I believe this PR would fix it: https://github.com/NVlabs/FAN/pull/21