ryankiros / skip-thoughts

Sent2Vec encoder and training code from the paper "Skip-Thought Vectors"

Key Error #55

Open rakshajalan opened 7 years ago

rakshajalan commented 7 years ago

The size of my vocabulary is 175, so why is the code looking for index 176?

Error:

```
model = tools.load_model(embed_map)
  File "/home/raksha/prog/skip-thoughts-master/training/tools.py", line 74, in load_model
    table = lookup_table(options, embed_map, worddict, word_idict, f_emb)
  File "/home/raksha/prog/skip-thoughts-master/training/tools.py", line 163, in lookup_table
    wordvecs = get_embeddings(options, word_idict, f_emb)
  File "/home/raksha/prog/skip-thoughts-master/training/tools.py", line 184, in get_embeddings
    d[word_idict[i]] = ff
KeyError: 176
```
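This traceback suggests a mismatch between the `n_words` value stored in the model options and the number of entries actually present in `word_idict`: `get_embeddings` iterates over indices up to `n_words`, and the first index with no dictionary entry raises the `KeyError`. The sketch below is a minimal way to check for that mismatch; the pickle path, the `n_words` value, and the reserved low indices for the end-of-sentence and unknown tokens are assumptions based on how `training/tools.py` builds its dictionaries, so adjust them to match your setup.

```python
# Minimal diagnostic sketch (assumptions: dictionary saved as 'dictionary.pkl',
# n_words taken from your training options, indices 0 and 1 reserved for
# special tokens as in training/tools.py -- verify against your copy).
import pickle

with open('dictionary.pkl', 'rb') as f:  # path is an assumption
    worddict = pickle.load(f)

# Rebuild the inverse dictionary the way load_model does, including the
# reserved entries for the end-of-sentence and unknown-word tokens.
word_idict = {v: k for k, v in worddict.items()}
word_idict[0] = '<eos>'
word_idict[1] = 'UNK'

n_words_option = 177  # whatever n_words you set in the training options (assumption)

# get_embeddings loops over range(n_words); any index in that range that is
# missing from word_idict reproduces the KeyError from the traceback.
missing = [i for i in range(n_words_option) if i not in word_idict]
print('entries in word_idict:', len(word_idict))
print('first missing index  :', missing[0] if missing else None)
```

If the sketch reports a missing index, either lower `n_words` in the options to the number of entries in `word_idict`, or rebuild the dictionary so that its indices cover the full range the options expect.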