The description given in the readme about entropy seems off.
Experiments
Overall entropy
Readme claim: "On each step the overall entropy decreases."
Verification: in Model.cs, inside the for loop of Run(), compute the sum of entropies before Observe() and after Propagate(), and compare the values.
Result: sometimes the sum before is less than the sum after.
Comment: the overall entropy can increase after an observe-propagate step.
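A toy Python check (just the same arithmetic as the C# formula, not the actual Model.cs code; the two-node setup is an illustrative assumption) shows how the sum can go up: observing one node removes less entropy than propagation adds to a neighbor that loses its dominant pattern.

```python
from math import log

def entropy(weights):
    # Same formula as Model.cs: log(sumW) - sum(w*log w)/sumW
    s = sum(weights)
    return log(s) - sum(w * log(w) for w in weights) / s

# Two nodes, each still allowing three patterns with weights (1, 1, 98)
before = 2 * entropy([1, 1, 98])  # ~0.224

# Observe collapses the first node (entropy 0); propagation then bans
# the dominant pattern on the second node, leaving weights (1, 1).
after = 0.0 + entropy([1, 1])     # ~0.693

print(before < after)  # True: the total entropy went up
```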
Node entropy
Readme claim: "Note that the entropy of any node can't increase during the propagation phase."
Verification: in Model.cs, at the end of Ban(), compare the entropy before and after the update (excluding NaN values, and optionally adding a boolean flag to check whether we are in the propagation phase).
Result: sometimes the entropy before is less than the entropy after update.
Comment: the entropy of a node can increase during the propagation phase.
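The same arithmetic makes the single-node case concrete (a hedged Python sketch of what one Ban() call does to the weight vector, not the actual C# incremental update): removing a light pattern lowers the entropy as expected, but removing the heavy one raises it.

```python
from math import log

def entropy(weights):
    # Same formula as Model.cs: log(sumW) - sum(w*log w)/sumW
    s = sum(weights)
    return log(s) - sum(w * log(w) for w in weights) / s

h0 = entropy([1, 1, 98])        # ~0.112: one pattern dominates

# Banning a low-weight pattern lowers the entropy, as the readme expects...
h_ban_small = entropy([1, 98])  # ~0.056
# ...but banning the dominant pattern raises it.
h_ban_big = entropy([1, 1])     # ~0.693

print(h_ban_small < h0 < h_ban_big)  # True
```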
Possibilities
Readme claim: "possibilities are not arising, but can be canceled."
This sounds more like some kind of cardinality than entropy.
Verification: in Model.cs in Init(), initialize startingEntropy with T, and in Ban() update entropies[i] with sumsOfOnes[i].
Result: the generated images are similar; only Skyline and Platformer look more uniform (by visual inspection, and it depends on the random seed).
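Under that substitution the heuristic picks the node with the fewest remaining patterns instead of the one with the lowest entropy, and the two rules can disagree. A minimal Python sketch (the node names and the `min`-based selection are illustrative assumptions, not code from Model.cs):

```python
from math import log

def entropy(weights):
    # Same formula as Model.cs: log(sumW) - sum(w*log w)/sumW
    s = sum(weights)
    return log(s) - sum(w * log(w) for w in weights) / s

# Two undecided nodes and the weights of their remaining patterns
nodes = {"a": [1, 1, 98], "b": [1, 1]}

# Entropy heuristic: "a" is almost decided, so it has the lowest entropy...
by_entropy = min(nodes, key=lambda n: entropy(nodes[n]))
# ...while the cardinality heuristic prefers "b", which has fewer patterns left.
by_count = min(nodes, key=lambda n: len(nodes[n]))

print(by_entropy, by_count)  # a b
```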
Conclusion
The entropy formula used is not monotonic with respect to the individual weights: it measures the uniformity of the distribution, not the number of remaining possibilities. Adding a weight that is very large compared to the existing weights makes the distribution more peaked and decreases the entropy; consequently, removing such a dominant weight (which is exactly what Ban() can do during propagation) increases it.
Yeah, that's right, propagation can turn the distribution (0.01, 0.01, 0.98) into (0.5, 0.5, 0.0), which has a higher entropy. Thanks, I'll correct this.
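The reply's numbers check out in a few lines of Python (here the distribution is already normalized, so this is plain Shannon entropy in nats):

```python
from math import log

def entropy(probs):
    # Shannon entropy in nats; the 0*log(0) term is taken as 0
    return -sum(p * log(p) for p in probs if p > 0)

before = entropy([0.01, 0.01, 0.98])  # ~0.112
after = entropy([0.5, 0.5, 0.0])      # ~0.693

print(before < after)  # True
```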