summerlvsong / Aggregation-Cross-Entropy

Aggregation Cross-Entropy for Sequence Recognition. CVPR 2019.

KL Divergence #15

Open viig99 opened 4 years ago

viig99 commented 4 years ago

In the paper (https://arxiv.org/pdf/1904.08364.pdf), Section 3.2 states: "We borrow the concept of cross-entropy from information theory, which is designed to measure the “distance” between two probability distributions."

Won't KL divergence be a better way to measure the distance between the two probability distributions?
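
For reference, cross-entropy and KL divergence differ only by the entropy of the target distribution: H(p, q) = H(p) + D_KL(p || q). When the target p is fixed, H(p) is a constant, so minimising cross-entropy with respect to the prediction q is equivalent to minimising the KL divergence. A minimal numpy sketch illustrating this identity (the distributions p and q below are made-up values for illustration only, not from the paper or this repo):

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p * log p."""
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum p * log q."""
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL divergence D_KL(p || q) = sum p * log(p / q)."""
    return np.sum(p * np.log(p / q))

# Hypothetical example distributions.
p = np.array([0.7, 0.2, 0.1])  # target (ground-truth) distribution
q = np.array([0.5, 0.3, 0.2])  # predicted distribution

# Identity: H(p, q) = H(p) + D_KL(p || q).
# H(p) does not depend on q, so both quantities have the same gradient
# with respect to q and the same minimiser.
print(cross_entropy(p, q))               # ~0.8870
print(entropy(p) + kl_divergence(p, q))  # same value
```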