
KL Divergence (Kullback-Leibler Divergence): Introduction and Detailed Derivation | HsinJhao's Blogs #1

Open · HsinJhao opened this issue 5 years ago

HsinJhao commented 5 years ago

https://hsinjhao.github.io/2019/05/22/KL-DivergenceIntroduction/#more

Introduction to KL divergence: the concept of KL divergence comes from probability theory and information theory. KL divergence is also known as relative entropy, mutual entropy, discrimination information, Kullback entropy, or Kullback-Leibler divergence (of which "KL divergence" is the abbreviation). In machine learning and deep learning, KL divergence is widely used in variational autoencoders (VAE), the EM algorithm, and GANs. Definition of KL divergence: the definition of KL divergence is built on the concept of entropy. Here, taking a discrete random…
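For context, here is a minimal sketch of the discrete-case definition the post builds toward, assuming standard notation (reconstructed from the usual entropy-based development, not quoted from the post itself):

```latex
% Entropy of a discrete distribution p over outcomes x_1, ..., x_n
H(p) = -\sum_{i=1}^{n} p(x_i) \log p(x_i)

% Cross entropy of p relative to q
H(p, q) = -\sum_{i=1}^{n} p(x_i) \log q(x_i)

% KL divergence: the gap between cross entropy and entropy
D_{KL}(p \| q) = H(p, q) - H(p)
              = \sum_{i=1}^{n} p(x_i) \log \frac{p(x_i)}{q(x_i)}
```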

liyang-good commented 3 years ago

$D_{KL}(p \| q) = \sum_{i=1}^{n} p(x) \log \frac{p(x)}{q(x)}$: what does the index $i$ represent?
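For what it's worth, in the discrete formula the sum runs over the $n$ possible outcomes $x_i$ of the random variable, so each term weights the log-ratio by $p(x_i)$. A quick sketch of the computation (the distributions `p_vals` and `q_vals` are made-up examples, not taken from the post):

```python
import numpy as np

# Two example discrete distributions over the same n = 3 outcomes.
# (Illustrative values; any pair with q > 0 wherever p > 0 works.)
p_vals = np.array([0.5, 0.3, 0.2])
q_vals = np.array([0.4, 0.4, 0.2])

def kl_divergence(p, q):
    """D_KL(p || q) = sum_i p(x_i) * log(p(x_i) / q(x_i)).

    The index i enumerates the outcomes x_1, ..., x_n of the
    discrete random variable, one summand per outcome.
    """
    return float(np.sum(p * np.log(p / q)))

print(kl_divergence(p_vals, q_vals))  # small positive value
print(kl_divergence(p_vals, p_vals))  # 0.0 for identical distributions
```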

DuskSwan commented 6 months ago

👍👍👍

sichenyong commented 5 months ago

This is explained so clearly!

jxzhanggg commented 5 months ago

Extremely well written, 666 (awesome)!