Closed: a11apurva closed this issue 6 years ago
GloVe provides the word embeddings, which are kept fixed during training and fed into the encoder. AllNLI is an encoder that has been trained on SNLI+MultiNLI. If you want to use fastText embeddings on your own corpus, you will need to train a new encoder from scratch using those embeddings and that data.
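If you go the fastText route, the first practical step is to point the vocabulary-building code at a fastText `.vec` file instead of the GloVe `.txt` file. Here is a minimal sketch of parsing that format; the only layout difference from GloVe is the `count dim` header line. The function name `load_vec_lines` and the sample data are hypothetical, not part of any library:

```python
# Sketch: loading fastText-style .vec word vectors so they can stand in
# for GloVe when training a new encoder from scratch. Format assumption:
# a "count dim" header line, then one "word v1 ... vdim" line per word.
# (GloVe .txt files use the same per-word layout but have no header.)

def load_vec_lines(lines):
    """Parse fastText .vec text lines into a {word: vector} dict."""
    it = iter(lines)
    count, dim = (int(x) for x in next(it).split())  # header: "count dim"
    vectors = {}
    for line in it:
        parts = line.rstrip().split(" ")
        word, values = parts[0], [float(v) for v in parts[1:]]
        assert len(values) == dim, f"bad vector length for {word!r}"
        vectors[word] = values
    return vectors

# Tiny in-memory example standing in for a real .vec file.
sample = [
    "2 3",
    "hello 0.1 0.2 0.3",
    "world 0.4 0.5 0.6",
]
vectors = load_vec_lines(sample)
print(len(vectors), len(vectors["hello"]))  # 2 3
```

With a real file you would pass `open("cc.xx.300.vec", encoding="utf-8")` (a hypothetical path) instead of `sample`; the resulting dict is what the encoder's embedding lookup would be built from.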
I am not able to understand why both the AllNLI and GloVe models are required.
If I want to load my own model generated by fastText, which one should I replace?
P.S. My corpus is not in English.