suriyadeepan / practical_seq2seq

A simple, minimal wrapper for tensorflow's seq2seq module, for experimenting with datasets rapidly
http://suriyadeepan.github.io/2016-12-31-practical-seq2seq/
GNU General Public License v3.0

Add vocab words then resume training? #7

Closed dotjrt closed 7 years ago

dotjrt commented 7 years ago

Is there a way to restore from a previous checkpoint after adding new words to the vocabulary files?

suriyadeepan commented 7 years ago

I don't think that's possible. Adding new words would break the model, since both the embedding layer and the softmax projection in the decoder depend on the vocabulary size.
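To make the shape dependence concrete, here is a minimal numpy sketch (all names and sizes are illustrative, not from the repo). The checkpoint stores an embedding matrix with one row per vocabulary word, so a checkpoint saved with the old vocabulary cannot be restored directly into a model built with a larger one. A common workaround people try is to build the larger matrix, copy the trained rows over, and randomly initialize only the rows for the new words; the decoder's output projection (shaped `[hidden_dim, vocab_size]`) would need the same treatment, and the model would still need further training for the new words.

```python
import numpy as np

rng = np.random.default_rng(0)

old_vocab, emb_dim = 1000, 128
# Embedding matrix as saved in the checkpoint: one row per vocabulary word.
old_embedding = rng.normal(size=(old_vocab, emb_dim))

new_vocab = 1010  # e.g. 10 new words appended to the vocabulary files
# Restoring directly fails: a (1000, 128) tensor cannot fill a (1010, 128)
# variable. Workaround: allocate the larger matrix, copy the trained rows,
# and give only the new words small random embeddings.
new_embedding = rng.normal(size=(new_vocab, emb_dim)) * 0.01
new_embedding[:old_vocab] = old_embedding

print(old_embedding.shape)  # (1000, 128)
print(new_embedding.shape)  # (1010, 128)
```

The trained rows are preserved exactly, so previously learned words keep their embeddings; only the 10 new rows start from scratch.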