lm_embeddings holds the embeddings for all layers, starting with word embeddings at the lowest layer.
word_emb = ops['lm_embeddings'][:, 0, :, :512]  # word embeddings are duplicated for forward/backward, so take the first 512 dims
lstm_outputs1 = ops['lm_embeddings'][:, 1, :, :]  # first LSTM layer
lstm_outputs2 = ops['lm_embeddings'][:, 2, :, :]  # second (top) LSTM layer
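For context, ops here is the dictionary returned by calling a bilm-tf BidirectionalLanguageModel on a batch of character ids. A minimal sketch of how it might be produced, assuming the bilm-tf usage pattern (the vocab/options/weight file paths are placeholders):

import tensorflow as tf
from bilm import Batcher, BidirectionalLanguageModel

# Placeholder paths -- substitute your own vocab, options, and weight files.
vocab_file = 'vocab.txt'
options_file = 'elmo_options.json'
weight_file = 'elmo_weights.hdf5'

batcher = Batcher(vocab_file, 50)  # 50 = max characters per token
character_ids = tf.placeholder('int32', shape=(None, None, 50))

bilm = BidirectionalLanguageModel(options_file, weight_file)
ops = bilm(character_ids)  # dict containing 'lm_embeddings', 'mask', ...

# lm_embeddings has shape (batch, 3, n_tokens, 1024); slice it as shown above, e.g.
word_emb = ops['lm_embeddings'][:, 0, :, :512]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    ids = batcher.batch_sentences([['The', 'cat', 'sat', '.']])
    w = sess.run(word_emb, feed_dict={character_ids: ids})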
Thanks @matt-peters for the clarification. Where is this information documented?
When using the pre-trained models, I get an output dictionary that differs from the one described for the published tf.hub module. My output dictionary's signature is

while the output dictionary from tf.hub contains:

How can I access the word_emb, lstm_outputs1, lstm_outputs2, ... fields in the output dictionary? I am following the usage example to cache a dataset from this link.
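If the goal is the published TF Hub ELMo module (rather than the bilm-tf graph above), it exposes word_emb, lstm_outputs1, and lstm_outputs2 when called with as_dict=True. A minimal sketch, assuming the TF1 hub.Module API and the google/elmo/2 module:

import tensorflow as tf
import tensorflow_hub as hub

elmo = hub.Module("https://tfhub.dev/google/elmo/2", trainable=False)
outputs = elmo(
    ["the cat is on the mat", "dogs are in the fog"],
    signature="default",
    as_dict=True)

word_emb = outputs["word_emb"]            # (batch, max_tokens, 512) character-based word embeddings
lstm_outputs1 = outputs["lstm_outputs1"]  # (batch, max_tokens, 1024) first LSTM layer
lstm_outputs2 = outputs["lstm_outputs2"]  # (batch, max_tokens, 1024) second LSTM layer
elmo_vectors = outputs["elmo"]            # weighted sum of the three layers

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    w, l1, l2 = sess.run([word_emb, lstm_outputs1, lstm_outputs2])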