zehuanzhang opened 3 years ago
In GeoMAN.py, lines 262 to 273:

```python
# multiply attention weights with the original input
local_x = local_attn * local_inp
global_x = global_attn * global_inp
# run the BasicLSTM with the new input
cell_output, state = cell(tf.concat([local_x, global_x], axis=1), state)
# run the attention mechanism
with tf.variable_scope('local_spatial_attn'):
    local_attn = local_attention(state)
with tf.variable_scope('global_spatial_attn'):
    global_attn = global_attention(state)
attn_weights.append((local_attn, global_attn))
```
Does this only consider the cell state when calculating the attention? In Equation (1) of the paper, aren't the cell state and the hidden state concatenated?
The variable "state" actually contains both the cell state and the hidden state.
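
For context: with TensorFlow 1.x's `BasicLSTMCell` and the default `state_is_tuple=True`, `state` is an `LSTMStateTuple` whose `c` and `h` fields hold the cell state and the hidden state, so passing `state` into the attention functions hands them both. Below is a minimal sketch of how an attention query can concatenate the two in the spirit of Equation (1); the sizes and the dense scoring layer are illustrative assumptions, not the repo's actual code:

```python
import tensorflow as tf  # TensorFlow 1.x

# illustrative sizes, not taken from the repo
batch_size, n_hidden, n_local = 4, 64, 10

cell = tf.nn.rnn_cell.BasicLSTMCell(n_hidden, state_is_tuple=True)
state = cell.zero_state(batch_size, tf.float32)

# state.c is the cell state s_{t-1}, state.h is the hidden state h_{t-1};
# concatenating them gives the [h_{t-1}; s_{t-1}] query of Equation (1)
query = tf.concat([state.h, state.c], axis=1)  # [batch_size, 2 * n_hidden]

# crude stand-in for the Eq. (1) scoring: one logit per local feature,
# softmaxed into attention weights (hypothetical layer, for shapes only)
scores = tf.layers.dense(query, n_local, activation=tf.tanh)
local_attn = tf.nn.softmax(scores)  # [batch_size, n_local]
```

So even though only `state` is passed in, the attention still sees both halves, which is consistent with the concatenation in Equation (1).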