vineetm / ell-881-2018-deep-learning

Course Materials for ELL 881 2018: Fundamentals of Deep Learning

Project#1: For tasks NLI and NMT #28

Open · pushpendradahiya opened 6 years ago

pushpendradahiya commented 6 years ago

For sequence encoding tasks like NLI and NMT, the encoder produces a vector representation of the sentence. My question is whether we are supposed to use pre-trained word embeddings to build these sentence embeddings. If not, is it possible to learn word embeddings from a sequence task like NLI? Even the paper's GitHub repo shows that they used 300-dimensional GloVe vectors for the NLI task.

prateek27 commented 6 years ago

I think you can't use pre-trained word embeddings; they will be trained automatically. Also, the assignment specifies an embedding size of 256, while GloVe vectors are 300-dimensional, so they wouldn't fit anyway.

vineetm commented 6 years ago

@pushpendradahiya, @prateek27 is right. Word embeddings will be trained automatically as part of the training tasks. They should be initialized randomly, as discussed in the lectures; see the sketch below.
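
For illustration, here is a minimal sketch of a randomly initialized, trainable embedding layer feeding a sentence encoder. It assumes PyTorch, which the thread does not specify, and the names `Encoder`, `VOCAB_SIZE`, and `HIDDEN_DIM` are hypothetical placeholders rather than values from the assignment; only the embedding size of 256 comes from the thread.

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 10000   # hypothetical vocabulary size
EMBED_DIM = 256      # embedding size specified in the assignment
HIDDEN_DIM = 512     # hypothetical encoder hidden size

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Embedding initializes its weights randomly by default,
        # so no pre-trained (GloVe) vectors are loaded here.
        self.embedding = nn.Embedding(VOCAB_SIZE, EMBED_DIM)
        self.lstm = nn.LSTM(EMBED_DIM, HIDDEN_DIM, batch_first=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)   # (batch, seq_len, 256)
        _, (h_n, _) = self.lstm(embedded)
        return h_n[-1]                         # sentence vector: (batch, 512)

encoder = Encoder()
sentence_vec = encoder(torch.randint(0, VOCAB_SIZE, (4, 12)))
print(sentence_vec.shape)  # torch.Size([4, 512])
```

Because the embedding weights are ordinary parameters, they receive gradients from the task loss (NLI or NMT) and are learned jointly with the rest of the encoder, which is exactly the "trained automatically" behavior described above.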