aravindsrinivas opened this issue 6 years ago
https://github.com/brain-research/self-attention-gan/blob/ad9612e60f6ba2b5ad3d3340ebae60f724636d75/non_local.py#L77
Is there a mistake in the reshape operation linked above? Shouldn't it be
```python
attn_g = tf.reshape(attn_g, [batch_size, h // 2, w // 2, num_channels // 2])
attn_g = tf.depth_to_space(attn_g, 2)  # block_size=2 -> [batch_size, h, w, num_channels // 8]
attn_g = sn_conv1x1(attn_g, num_channels, update_collection, init, 'sn_conv_attn')
```

instead of

```python
attn_g = tf.reshape(attn_g, [batch_size, h, w, num_channels // 2])
```

?