Hi Sandeep,
I have a question about the input word embeddings. As you described in the README, pre-trained GloVe is used. But in the paper, the word embeddings were learned. If I understand correctly, GloVe is only used when we want to expand the vocabulary. When generating sentence representations, the model still uses the learned word embeddings. Is this right?
Yes, exactly! Vocab expansion just linearly maps the embeddings of words that are in the GloVe dictionary (but not in my model's dictionary) into the space of my learned model's word embeddings.
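For anyone curious what that linear mapping looks like in practice, here is a minimal NumPy sketch of the general technique (fit a least-squares map on the shared vocabulary, then project GloVe-only words). The dimensions and word lists are toy stand-ins, not the repo's actual data or code:

```python
import numpy as np

# Assumed toy setup: random vectors standing in for real GloVe and
# learned model embeddings. Only the shared words are used to fit the map.
rng = np.random.default_rng(0)
glove_dim, model_dim = 300, 620
shared_words = ["cat", "dog", "house"]
glove = {w: rng.standard_normal(glove_dim) for w in shared_words + ["aardvark"]}
model = {w: rng.standard_normal(model_dim) for w in shared_words}

# Stack the vectors of words present in BOTH vocabularies.
X = np.stack([glove[w] for w in shared_words])   # (n, glove_dim)
Y = np.stack([model[w] for w in shared_words])   # (n, model_dim)

# Least-squares solve for W such that X @ W ~= Y.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project a GloVe-only word into the model's embedding space.
expanded = {"aardvark": glove["aardvark"] @ W}
```

After this, `expanded["aardvark"]` lives in the same space as the learned embeddings, so the encoder can consume it like any in-vocabulary word.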