Closed · bmilde closed this 4 years ago
Can you add KL-divergence as a distance metric to the ABX eval? This is usually needed to compare pseudo-posteriorgrams. Thank you!
I've implemented KL-divergence and symmetric KL in ABX_src/abx_group_computation.py and will open a pull request soon! On my posteriorgrams it also gives better results than the cosine/Euclidean distances.
Hello @bmilde, as I understand it, this was solved with https://github.com/facebookresearch/libri-light/pull/39 ? Can I close the issue? Thanks!
Yes, it can be closed!
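For readers landing here: the distances discussed above can be sketched as follows. This is a minimal NumPy illustration of (symmetric) KL-divergence between posteriorgram frames, not the actual implementation merged in the linked pull request; the function names `kl_divergence` and `symmetric_kl` and the clipping constant are assumptions for this sketch.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-8):
    # Per-frame KL(p || q) for posteriorgrams of shape (T, D),
    # where each row is a probability distribution over D classes.
    # Clip to avoid log(0) for near-zero posteriors.
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return np.sum(p * (np.log(p) - np.log(q)), axis=-1)

def symmetric_kl(p, q, eps=1e-8):
    # Symmetrized KL: 0.5 * (KL(p||q) + KL(q||p)), so that the
    # result can be used as a (symmetric) frame-wise distance.
    return 0.5 * (kl_divergence(p, q, eps) + kl_divergence(q, p, eps))

# Example: one frame of a 3-class posteriorgram each
p = np.array([[0.7, 0.2, 0.1]])
q = np.array([[0.1, 0.2, 0.7]])
print(symmetric_kl(p, q))
```

Unlike cosine or Euclidean distance, this treats each frame as a probability distribution, which is why it tends to be a better fit for pseudo-posteriorgrams.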