In 'models/graph_memory.py', line 274, i.e., in the forward function of class 'Memory':
m_out_all[:, :, x, :, :] = hiden_state
It seems that the updated memory tensor m_out_all will not influence the next graph propagation, because according to line 258, m_out_all = torch.cat((m_out, q_out.unsqueeze(2)), dim=2).contiguous() # B, D_o, T+1, H, W, the variable m_out_all is rebuilt from the initial memory state m_out rather than from the m_out_all refined in earlier propagations.
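To make the concern concrete, here is a toy sketch (not the actual model code; the shapes, the "+ 1.0" propagation stand-in, and the two-round outer loop are all hypothetical): if the line-258-style concatenation is re-executed from the initial m_out at the start of each propagation round, the in-place writes from the previous round are discarded.

```python
import torch

B, D, T, H, W = 1, 2, 3, 2, 2
m_out = torch.zeros(B, D, T, H, W)   # initial memory states
q_out = torch.ones(B, D, H, W)       # query feature

for step in range(2):  # two propagation rounds (hypothetical)
    # Rebuilt from the *initial* m_out each round, so the refinements
    # written below are thrown away at the start of the next round.
    m_out_all = torch.cat((m_out, q_out.unsqueeze(2)), dim=2).contiguous()  # B, D, T+1, H, W
    for x in range(T):
        hiden_state = m_out_all[:, :, x] + 1.0  # stand-in for graph propagation
        m_out_all[:, :, x] = hiden_state        # line-274-style in-place write

# After two rounds the memory slots still hold 1.0, not 2.0:
# the refinement from round 1 never reached round 2.
print(m_out_all[:, :, :T].unique())
```

If that matches the real control flow, the refined states would only take effect if either the propagation read from the updated m_out_all, or m_out itself were overwritten between rounds.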
Could you please answer my question?