facebookresearch / libri-light

Dataset for lightly supervised training using the LibriVox audiobook recordings. https://librivox.org/.
MIT License

Feature request: KL-divergence in ABX computations #37

Closed · bmilde closed 4 years ago

bmilde commented 4 years ago

Can you add KL-divergence as a distance metric to the ABX eval? It is commonly needed/used to compare pseudo-posteriorgrams. Thank you!

bmilde commented 4 years ago

I've implemented KL divergence and symmetric KL in ABX_src/abx_group_computation.py and will open a pull request soon! It also gives better results on my posteriorgrams than the cosine/Euclidean distances.
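For reference, here is a minimal sketch of what a symmetric KL "distance" between posteriorgram frames can look like. This is not the code from the pull request; the function names, shapes, and epsilon handling are illustrative only.

```python
# Sketch of frame-wise KL and symmetric KL between posteriorgrams.
# Names and tensor shapes are assumptions for illustration, not the PR's API.
import torch

def kl_divergence(p, q, eps=1e-8):
    # Frame-wise KL(p || q) for posteriorgrams of shape (..., n_frames, n_classes).
    # A small epsilon keeps the logs finite when a class has zero probability.
    p = p.clamp(min=eps)
    q = q.clamp(min=eps)
    return (p * (p.log() - q.log())).sum(dim=-1)

def symmetric_kl(p, q, eps=1e-8):
    # Symmetrised KL: 0.5 * (KL(p || q) + KL(q || p)), so the result
    # no longer depends on the order of the two inputs.
    return 0.5 * (kl_divergence(p, q, eps) + kl_divergence(q, p, eps))

# Toy usage: two pseudo-posteriorgrams with 4 frames over 10 classes.
a = torch.softmax(torch.randn(4, 10), dim=-1)
b = torch.softmax(torch.randn(4, 10), dim=-1)
print(symmetric_kl(a, b))  # one non-negative value per frame
```

Symmetrising matters because plain KL is asymmetric, whereas the ABX computation treats the pairwise scores as a distance.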

bmilde commented 4 years ago

https://github.com/facebookresearch/libri-light/pull/39

eugene-kharitonov commented 4 years ago

Hello @bmilde, as I understand it, this was solved with https://github.com/facebookresearch/libri-light/pull/39? Can I close the issue? Thanks!

bmilde commented 4 years ago

Yes, it can be closed!