Thank you for the source code for attention-transfer, but I am not familiar with PyTorch. I do not understand the interpolation's implementation. How does it work? How do you do the interpolation if the two feature maps' dimensions are not the same? Could you explain it to me clearly?