Open njusegzyf opened 7 years ago
In addition to the original question: is there a way to harness the new AttentionWrapper for sequence classification (i.e., producing a meaning vector), e.g. as a words-to-sentence reduction step? I am mostly referring to the Hierarchical Attention Networks for Document Classification paper. This would be very helpful.
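The words-to-sentence reduction in that paper is an attention-pooling step: score each word annotation against a learned context vector, softmax the scores, and take the weighted sum. A minimal NumPy sketch of that computation (all weight shapes and names here are illustrative assumptions, not code from the paper or this repo):

```python
import numpy as np

def attention_pool(H, W, b, u):
    """Reduce word annotations H of shape (T, d) to one sentence vector.

    Follows the HAN recipe: u_t = tanh(W h_t + b), alpha = softmax(u_t . u),
    s = sum_t alpha_t * h_t. W, b, u would normally be learned parameters.
    """
    U = np.tanh(H @ W + b)                    # (T, a) hidden representations
    scores = U @ u                            # (T,) similarity to context vector u
    scores -= scores.max()                    # shift for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # (T,) attention weights
    return alpha @ H                          # (d,) sentence meaning vector

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 16))              # 5 word vectors of size 16
W = rng.standard_normal((16, 8))
b = np.zeros(8)
u = rng.standard_normal(8)                    # word-level context vector

s = attention_pool(H, W, b, u)
print(s.shape)  # (16,)
```

The resulting vector s can then feed a softmax classifier, exactly as the sentence-to-document level of the paper stacks the same mechanism again.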
Hi,
I'm trying to use this model in my research. During inference I want to get the content/meaning vector produced by the encoder, which represents the sentence meaning. However, I found it hard to extract. The only way I can find is to use "tf.Print" to print the data when evaluating "encoder_state", which is a local variable in the "build_graph" method of the model. I also failed to attach extra processing to "encoder_state" with "tf.py_func" as follows, which causes the error: AttributeError: 'tuple' object has no attribute 'encode'.
Can you give some suggestions for extracting the meaning vector? Thanks a lot.
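One common pattern for this (a sketch, not the nmt codebase's own API) is to tag the tensor with tf.identity under a known name inside build_graph, then fetch it by that name in session.run at inference time, alongside the usual decode fetches. Note that for an LSTM encoder, encoder_state is typically a tuple (an LSTMStateTuple, or a tuple of them for multi-layer encoders), so you would tag a single component such as the top layer's .h rather than the tuple itself. The toy example below uses TF 1.x-style graph mode with a stand-in matmul "encoder"; the name "meaning_vector" and all shapes are assumptions for illustration:

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Stand-in for the real encoder: in build_graph you would instead wrap the
# relevant component of encoder_state (e.g. encoder_state[-1].h for LSTMs).
inputs = tf.compat.v1.placeholder(tf.float32, [None, 4], name="encoder_inputs")
w = tf.Variable(tf.ones([4, 8]))
encoder_state = tf.matmul(inputs, w)

# Tag the tensor with a stable, known name so it can be fetched later.
meaning_vector = tf.identity(encoder_state, name="meaning_vector")

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    # At inference time, fetch the tagged tensor by name (add it to the
    # existing fetches of the model's infer step).
    vec = sess.run("meaning_vector:0",
                   feed_dict={inputs: np.ones((2, 4), np.float32)})
    print(vec.shape)  # (2, 8)
```

On the tf.py_func error: py_func expects a list of Tensors as its inp argument, so passing the state tuple directly (rather than its individual Tensor components) is a likely cause of the failure you saw; unpacking the tuple first should avoid it.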