Open harudaee opened 4 years ago
Thanks,
I use 'return x * Mf' locally, but the repository code has not been updated yet. I will push the corrected code as soon as possible.
Thank you, sir.
Hello sir, I want to add this CBAM block to the YOLOv4 object detection model. Could you please give me some idea of how to proceed? It would help me a lot. Thank you in advance!!
Excuse me, sir.
I read your file 'attention.py' and found something wrong. In line 62, there is 'return x + (x * Mf)'.
According to Eq.(2) in the paper BAM: Bottleneck Attention Module, it should be 'return x * Mf',
because in the .py file you have defined 'Mf = 1 + self.sigmoid(Mc * Ms)'.
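To make the algebra behind this report concrete, here is a minimal numpy sketch (not the repo's code; the tensor values are arbitrary stand-ins for the feature map and the two attention maps). It checks that with Mf defined as 1 + sigmoid(...), 'x * Mf' already equals the paper's F + F * sigmoid(...), while 'x + (x * Mf)' adds the identity term twice:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy values standing in for the feature map and attention maps
# (names x, Mc, Ms follow the issue; the numbers are arbitrary).
x = np.array([1.0, 2.0, 3.0])
Mc = np.array([0.5, -1.0, 2.0])   # hypothetical channel attention
Ms = np.array([1.0, 0.5, -0.5])   # hypothetical spatial attention

Mf = 1.0 + sigmoid(Mc * Ms)

# 'x * Mf' expands to x + x * sigmoid(...), i.e. the residual form of Eq.(2).
assert np.allclose(x * Mf, x + x * sigmoid(Mc * Ms))

# 'x + (x * Mf)' adds x twice: 2*x + x * sigmoid(...).
assert np.allclose(x + x * Mf, 2 * x + x * sigmoid(Mc * Ms))
```

So the residual connection is already baked into the '1 +' inside Mf, which is why returning 'x + (x * Mf)' applies it a second time.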
Why is it Mf = 1 + self.sigmoid(Mc * Ms) instead of Mf = 1 + self.sigmoid(Mc + Ms), and where can I find an explanation of this?
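For what it's worth, the two combinations are not equivalent in general; a tiny numpy check with arbitrary attention values (again, not the repo's code) shows they produce different maps:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical channel/spatial attention values, chosen only to show
# that sum and product inside the sigmoid give different results.
Mc = np.array([0.5, -1.0, 2.0])
Ms = np.array([1.0, 0.5, -0.5])

paper = sigmoid(Mc + Ms)   # BAM paper, Eq.(2): sigmoid of the element-wise sum
repo  = sigmoid(Mc * Ms)   # what 'Mf = 1 + self.sigmoid(Mc * Ms)' computes

assert not np.allclose(paper, repo)
```

The BAM paper itself combines the two branches by element-wise sum before the sigmoid, so the product version would be a deviation from the paper rather than something described in it.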