mli / new-docs

https://beta.mxnet.io/

Added KL Divergence loss tutorial #92

Closed · thomelane closed this 5 years ago

thomelane commented 5 years ago

A lot of confusion has been caused over how to use the KL Divergence loss function. We need to document it, and this tutorial is an extension of @NRauschmayr's tutorial on loss functions, from which this tutorial will be linked. It shows the difference between `from_logits=True` and `from_logits=False`, and discusses various cases where the calculated divergence can be wrong or differ from the true KL divergence.
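For context, a minimal sketch of the `from_logits` behaviour in `mxnet.gluon.loss.KLDivLoss` that the tutorial covers (this is not code from the PR; the arrays and variable names are made up for illustration):

```python
from mxnet import nd
from mxnet.gluon import loss

# Target distribution (always expected to be probabilities) and raw prediction scores.
# Values are illustrative only.
target = nd.array([[0.1, 0.2, 0.3, 0.4]])
scores = nd.array([[1.0, 2.0, 3.0, 4.0]])

# from_logits=True (the default): predictions must already be log-probabilities,
# so we apply log_softmax ourselves before calling the loss.
kl_from_logits = loss.KLDivLoss(from_logits=True)
log_probs = nd.log_softmax(scores)
print(kl_from_logits(log_probs, target))

# from_logits=False: raw scores are accepted and log_softmax is applied internally,
# so this call should give the same value as the one above.
kl_from_scores = loss.KLDivLoss(from_logits=False)
print(kl_from_scores(scores, target))
```

A common mistake is passing plain probabilities with `from_logits=True`, which silently produces a value that is not the KL divergence; if I recall the implementation correctly, the loss also averages rather than sums over the class axis, so even a correct call can differ from the textbook quantity by a constant factor. These are the kinds of discrepancies the tutorial walks through.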

thomelane commented 5 years ago

@NRauschmayr please review, thanks!

thomelane commented 5 years ago

thanks for the review @NRauschmayr!