Closed yufengwhy closed 6 years ago
https://github.com/kevinzakka/recurrent-visual-attention/blob/master/model.py#L110
Should this line be deleted? Since log_pi is already a vector of shape (B,) by the last line, we don't need to sum over dim=1.
You have B entries, each with 2 values (x, y). Read the comments; they explain why we sum along dimension 1.
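For concreteness, here is a minimal sketch of the shapes involved, assuming the 2D Gaussian location policy from the paper (the variable names mirror the repo's code, but the numbers are illustrative):

```python
import torch
from torch.distributions import Normal

B, std = 4, 0.17
mu = torch.zeros(B, 2)            # predicted (x, y) means, one pair per batch element
l_t = Normal(mu, std).sample()    # sampled glimpse locations, shape (B, 2)

# log_prob is evaluated per coordinate, so this has shape (B, 2), not (B,)
log_pi = Normal(mu, std).log_prob(l_t)

# x and y are independent, so the joint log-probability of the 2D location
# is the sum of the per-coordinate log-probs -> shape (B,)
log_pi = torch.sum(log_pi, dim=1)
```

So before the sum, `log_pi` holds two log-probabilities per batch element (one for x, one for y); the sum over dim=1 is what turns it into the (B,) vector of joint log-probs.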
@kevinzakka Thank you very much for your code, which helped me understand the paper, but I have a small question:
https://github.com/kevinzakka/recurrent-visual-attention/blob/master/model.py#L110
Should this line be deleted? Since log_pi is already a vector of shape (B,) by the last line, we don't need to sum over dim=1.