google / seq2seq

A general-purpose encoder-decoder framework for Tensorflow
https://google.github.io/seq2seq/
Apache License 2.0

Pre-trained word embeddings #111

Open markmishaev opened 7 years ago

markmishaev commented 7 years ago

Does the framework support using pre-trained word embeddings (Word2Vec or GloVe)?

dennybritz commented 7 years ago

Not right now. For most of the tasks we've been using this for (translation, summarization, etc.), there is enough data to learn embeddings from scratch. I'll tag this as a feature request; I'd imagine this is somewhat common for small datasets.

markmishaev commented 7 years ago

Thanks.

alexanderkjeldaas commented 7 years ago

I also need this.

markmishaev commented 7 years ago

Looks like I found a good example of how to load pre-trained embeddings here: https://github.com/Conchylicultor/DeepQA
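For anyone looking for a starting point: the DeepQA repo above shows one way to do it. Below is a minimal sketch of the common TensorFlow 1.x pattern (not part of this framework's API) for seeding an embedding variable from a GloVe text file; the file path, vocabulary, and variable name are placeholders you would swap for your own.

```python
import numpy as np
import tensorflow as tf


def load_glove(path, vocab, dim=100):
    """Build a [vocab_size, dim] matrix from a GloVe text file.

    Words missing from the file keep a small random initialization.
    """
    matrix = np.random.uniform(-0.1, 0.1, (len(vocab), dim)).astype(np.float32)
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            if word in vocab:
                matrix[vocab[word]] = np.asarray(values, dtype=np.float32)
    return matrix


# vocab: dict mapping token -> integer id, built from the training data
vocab = {"<unk>": 0, "the": 1, "cat": 2}  # toy example
pretrained = load_glove("glove.6B.100d.txt", vocab)

# Initialize the encoder's embedding variable from the pre-trained matrix.
embeddings = tf.get_variable(
    "encoder_embeddings",
    shape=pretrained.shape,
    initializer=tf.constant_initializer(pretrained),
    trainable=True)  # set False to keep the vectors frozen during training
```

Whether `trainable` should be True or False depends on how much in-domain data you have; with small datasets, freezing the pre-trained vectors often works better.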

sheerun commented 7 years ago

Wouldn't an idempotent encoder do the trick? Is one available?

KirtiBhandari-Reflektion commented 7 years ago

Much needed.