crazydonkey200 / neural-symbolic-machines

Neural Symbolic Machines is a framework to integrate neural networks and symbolic representations using reinforcement learning, with applications in program synthesis and semantic parsing.
Apache License 2.0

Is sequence entropy defined only for each action in a sequence? #6

Closed agarwl closed 6 years ago

agarwl commented 6 years ago

https://github.com/crazydonkey200/neural-symbolic-machines/blob/231bdc38ea2d84c3dcc883c3fd5cd2107f3d53b4/nsm/graph_factory.py#L347

Should there be another reduce_sum over sequence length dimension for sequence entropy term calculation?

crazydonkey200 commented 6 years ago

The name might be a bit confusing. Similar to seq_logits and seq_probs, seq_entropy refers to a sequence of entropy values (one per step), not the entropy of the whole sequence, so there isn't another reduce_sum.

I just made it a SeqTensor to make this more explicit. Also note that seq_entropy was added just for debugging purposes (for example, printing out the entropies at each step).
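To illustrate the distinction, here is a minimal NumPy sketch (not the repo's TensorFlow code; the shapes and names are hypothetical) showing per-step entropy versus the extra reduction over the sequence dimension that the question asks about:

```python
import numpy as np

# Hypothetical shapes mirroring a batch of action sequences:
# logits over n_actions at each of `time` steps.
batch, time, n_actions = 2, 3, 4
rng = np.random.default_rng(0)
logits = rng.normal(size=(batch, time, n_actions))

# Softmax over the action dimension.
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)

# Per-step entropy: a single reduction over actions only,
# leaving shape [batch, time] -- a *sequence of entropies*,
# which is what seq_entropy holds.
step_entropy = -np.sum(probs * np.log(probs), axis=-1)
print(step_entropy.shape)  # (2, 3)

# Summing the entropy of a whole sequence would require a
# second reduction over the time dimension:
total_entropy = step_entropy.sum(axis=-1)
print(total_entropy.shape)  # (2,)
```

Since seq_entropy is only used for step-by-step debugging output, the second reduction is intentionally absent.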