Closed — agarwl closed this issue 6 years ago
The name might be a bit confusing. Similar to seq_logits and seq_probs, seq_entropy refers to a sequence of entropy values (one per step), not the entropy value of the whole sequence, so there isn't another reduce_sum.
I made it a SeqTensor to make this more explicit. Also note that seq_entropy was added just for debugging purposes (for example, printing out the entropies at each step).
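A minimal NumPy sketch of the distinction (shapes and variable names here are illustrative, not the repo's actual API): seq_entropy holds one entropy value per decoding step, and only an extra reduction over the time dimension would give the entropy of the whole sequence.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Hypothetical decoder outputs: [batch, seq_len, vocab].
seq_logits = rng.standard_normal((2, 5, 10))
seq_probs = softmax(seq_logits)

# seq_entropy: entropy at EACH step, shape [batch, seq_len] --
# a sequence of entropy values, mirroring seq_logits / seq_probs.
seq_entropy = -np.sum(seq_probs * np.log(seq_probs), axis=-1)

# The entropy of the whole sequence (what the question asks about)
# would need one more reduction over the sequence-length dimension:
sequence_entropy = seq_entropy.sum(axis=-1)  # shape [batch]
```

So the extra reduce_sum is only needed if you want a single per-sequence entropy term, which is a separate quantity from what seq_entropy stores.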
https://github.com/crazydonkey200/neural-symbolic-machines/blob/231bdc38ea2d84c3dcc883c3fd5cd2107f3d53b4/nsm/graph_factory.py#L347
Should there be another reduce_sum over the sequence-length dimension when calculating the sequence entropy term?