fwang91 / residual-attention-network

Residual Attention Network for Image Classification

Confused about the Interp layer #4


Queequeg92 commented 7 years ago

According to your paper, the "Interp" layer does a bilinear interpolation to upsample the output of its previous layer. But the "Interp" layer has two inputs in your implementation. I'm not very familiar with Caffe. Could you provide some documentation of the "Interp" layer in Caffe? Are there any alternatives in TensorFlow or PyTorch?

ondrejbiza commented 7 years ago

The first input to the layer is the tensor to be resized; the second input dictates the target spatial size. In TensorFlow, you can use tf.image.resize_images with bilinear interpolation.
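For reference, here is a minimal sketch of that idea (the helper name `interp_like` and the tensor layouts are assumptions, not part of this repo): the first argument is the feature map to upsample and the second supplies the target spatial size, mirroring the two bottoms of the Caffe `Interp` layer.

```python
import tensorflow as tf  # TF 1.x-style API, matching tf.image.resize_images above

def interp_like(x, ref):
    """Bilinearly resize `x` to the spatial size of `ref`.

    x:   feature map, shape [batch, height, width, channels]
    ref: tensor whose height/width we want to match (the "second input")
    """
    target_size = tf.shape(ref)[1:3]  # dynamic [height, width] of the reference
    return tf.image.resize_images(x, target_size,
                                  method=tf.image.ResizeMethod.BILINEAR)
```

A PyTorch equivalent would use `torch.nn.functional.interpolate`; the DeepLab-style Caffe `Interp` layer is usually said to correspond to `align_corners=True`, so check that flag if you need exact numerical agreement.

```python
import torch.nn.functional as F

def interp_like(x, ref):
    # x, ref: tensors of shape [batch, channels, height, width]
    return F.interpolate(x, size=ref.shape[2:], mode='bilinear', align_corners=True)
```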

Queequeg92 commented 7 years ago

Got it! Thanks!