dmuqlzhang opened 5 years ago
I don't understand the kernel_size=7 or 14 before the last FCN layer in the attention module; it isn't reflected in the paper (DIAA). Could you please explain it when you have time? Thanks a lot!
Also, why is kernel_size set equal to the feature map size in the last layer of the attention module?
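For context, here is a minimal PyTorch sketch of what the question refers to: a convolution whose kernel_size equals the spatial size of its input feature map (7×7 or 14×14) produces a 1×1 output, so it aggregates the entire map, much like a fully-connected layer applied across all spatial positions. The channel counts and shapes below are illustrative assumptions, not taken from the DIAA code.

```python
import torch
import torch.nn as nn

# Assumed shapes for illustration: a 7x7 feature map with 256 channels.
feat = torch.randn(1, 256, 7, 7)  # (batch, channels, H, W)

# When kernel_size equals the spatial size (7 here), the kernel covers
# the whole feature map, so the conv collapses it to a 1x1 output --
# effectively a global, fully-connected aggregation over the map.
conv = nn.Conv2d(in_channels=256, out_channels=1, kernel_size=7)
out = conv(feat)
print(out.shape)  # torch.Size([1, 1, 1, 1])
```

With a 14×14 input map, kernel_size=14 would play the same role; the choice just tracks the spatial resolution of the feature map at that stage of the network.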