ExplorerFreda / Structured-Self-Attentive-Sentence-Embedding

An open-source implementation of the paper "A Structured Self-Attentive Sentence Embedding" (Lin et al., ICLR 2017).
GNU General Public License v3.0

global pooling layer #7

Open pinkfloyd06 opened 6 years ago

pinkfloyd06 commented 6 years ago

Hello,

Thank you for your work.

I have a question about your global pooling layer.

Is it implemented here? https://github.com/ExplorerFreda/Structured-Self-Attentive-Sentence-Embedding/blob/master/models.py#L50

Thank you
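For context on what that pooling step computes: the paper replaces max/mean pooling over LSTM hidden states with a learned self-attention that yields a matrix sentence embedding. Below is a minimal NumPy sketch of those equations, A = softmax(W_s2 · tanh(W_s1 · Hᵀ)) and M = A · H — it is not the code at models.py#L50, just an illustration of the mechanism; the shapes and names (`W_s1`, `W_s2`, `r`, `d_a`) follow the paper's notation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attentive_pool(H, W_s1, W_s2):
    """Structured self-attentive pooling (sketch, following the paper's notation).

    H    : (n, 2u)  hidden states of a BiLSTM over n tokens
    W_s1 : (d_a, 2u)
    W_s2 : (r, d_a) one row per attention hop
    Returns M (r, 2u) sentence embedding matrix and A (r, n) attention weights.
    """
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # (r, n), rows sum to 1
    M = A @ H                                         # (r, 2u) weighted sums
    return M, A

# Toy shapes for illustration only.
rng = np.random.default_rng(0)
n, two_u, d_a, r = 5, 6, 4, 2
H = rng.normal(size=(n, two_u))
W_s1 = rng.normal(size=(d_a, two_u))
W_s2 = rng.normal(size=(r, d_a))
M, A = self_attentive_pool(H, W_s1, W_s2)
```

Each of the `r` attention hops produces one weighted average of the hidden states, so the "pooling" output is an r × 2u matrix rather than a single vector.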