Similar to skip-gram in your typical word2vec. However, in this case the sentence is passed through an RNN encoder, whose output is fed simultaneously into:
a forward thought RNN decoder: predicts the next sentence
a backward thought RNN decoder: predicts the previous sentence
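The encoder-plus-two-decoders wiring can be sketched as below. This is a minimal NumPy sketch of the forward pass only (no training loop, random weights, vanilla tanh RNN cells instead of the GRUs used in the original skip-thought work); all parameter names (`Wx`, `Wh`, `Wc`, `Wo`) and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_emb, d_hid, vocab = 8, 16, 50  # toy sizes, chosen arbitrarily

# Encoder parameters: h_t = tanh(Wx x_t + Wh h_{t-1})
enc = {"Wx": rng.normal(size=(d_hid, d_emb)) * 0.1,
       "Wh": rng.normal(size=(d_hid, d_hid)) * 0.1}

def make_decoder():
    # Wc conditions every decoder step on the encoder's sentence vector
    return {"Wx": rng.normal(size=(d_hid, d_emb)) * 0.1,
            "Wh": rng.normal(size=(d_hid, d_hid)) * 0.1,
            "Wc": rng.normal(size=(d_hid, d_hid)) * 0.1,
            "Wo": rng.normal(size=(vocab, d_hid)) * 0.1}

fwd_dec, bwd_dec = make_decoder(), make_decoder()

def encode(p, xs):
    # Run the RNN over the word embeddings; the final hidden state
    # is the fixed-size sentence representation.
    h = np.zeros(d_hid)
    for x in xs:
        h = np.tanh(p["Wx"] @ x + p["Wh"] @ h)
    return h

def decode(p, h_enc, xs):
    # Each decoder step sees the previous token embedding AND the
    # encoder output h_enc, and emits vocabulary logits.
    h = np.zeros(d_hid)
    logits = []
    for x in xs:
        h = np.tanh(p["Wx"] @ x + p["Wh"] @ h + p["Wc"] @ h_enc)
        logits.append(p["Wo"] @ h)
    return np.stack(logits)

# Toy sentences represented as sequences of word embeddings
prev_s, cur_s, next_s = (rng.normal(size=(n, d_emb)) for n in (4, 5, 3))

h = encode(enc, cur_s)                    # sentence vector for the middle sentence
next_logits = decode(fwd_dec, h, next_s)  # forward decoder: predict next sentence
prev_logits = decode(bwd_dec, h, prev_s)  # backward decoder: predict previous sentence
print(h.shape, next_logits.shape, prev_logits.shape)  # (16,) (3, 50) (4, 50)
```

In training, both decoders' cross-entropy losses over their target sentences would be summed and backpropagated into the shared encoder, which is what makes the encoder's sentence vector useful on its own.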