The entropy of a distribution $P$ is $H(P) = -\sum_x P(x)\log P(x)$, which reflects the amount of uncertainty in $P$. Over a fixed finite set of outcomes, the uniform distribution has the largest entropy.
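As a quick numerical illustration (a minimal NumPy sketch; the distributions are made up for the example), the uniform distribution over four outcomes has higher entropy than a skewed one:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(P) = -sum_x P(x) log P(x), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p))

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]
print(entropy(uniform))  # log(4) ≈ 1.386, the maximum for 4 outcomes
print(entropy(skewed))   # ≈ 0.940, strictly smaller
```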
If we do not have prior knowledge about $P$ and guess it to be $Q$, we actually add extra uncertainty, giving the cross entropy $H(P, Q) = -\sum_x P(x)\log Q(x)$. From another perspective, cross entropy itself is a good alternative to the MSE loss when the model output goes through a sigmoid, as demonstrated below.
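To see why, compare the gradients with respect to the logit $z$ for a sigmoid output $\hat{y} = \sigma(z)$ and target $y$ (a standard derivation, sketched here):

```latex
\frac{\partial}{\partial z}\Bigl[-y\log\hat{y}-(1-y)\log(1-\hat{y})\Bigr] = \hat{y}-y,
\qquad
\frac{\partial}{\partial z}\,\frac{1}{2}(\hat{y}-y)^2 = (\hat{y}-y)\,\hat{y}(1-\hat{y}).
```

The MSE gradient carries the extra factor $\hat{y}(1-\hat{y})$, which vanishes when the sigmoid saturates, while the cross-entropy gradient stays proportional to the error.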
The discrepancy between $H(P, Q)$ and $H(P)$ is the relative entropy, also known as KL divergence, formulated as
$$D_{KL}(P \,\|\, Q) = H(P, Q) - H(P) = \sum_x P(x)\log\frac{P(x)}{Q(x)}.$$
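Here is a minimal numerical check (NumPy, with made-up distributions $P$ and $Q$) that $D_{KL}(P\|Q) = H(P, Q) - H(P)$:

```python
import numpy as np

P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.3, 0.4, 0.3])

H_P  = -np.sum(P * np.log(P))      # entropy H(P)
H_PQ = -np.sum(P * np.log(Q))      # cross entropy H(P, Q)
kl   = np.sum(P * np.log(P / Q))   # D_KL(P || Q)

print(kl)           # ≈ 0.088
print(H_PQ - H_P)   # same value, up to floating-point error
```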
KL divergence is non-negative, which can be proved using Jensen's Inequality. Besides, KL divergence is asymmetric: in general $D_{KL}(P \,\|\, Q) \neq D_{KL}(Q \,\|\, P)$. However, we can define a symmetric variant as
$$D(P, Q) = D_{KL}(P \,\|\, Q) + D_{KL}(Q \,\|\, P).$$
More of its properties can be found here.
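A short sketch (reusing the made-up $P$ and $Q$ from above) shows the asymmetry and the symmetrized value:

```python
import numpy as np

def kl(p, q):
    """D_KL(p || q) for discrete distributions with full support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.3, 0.4, 0.3])

print(kl(P, Q))             # ≈ 0.088
print(kl(Q, P))             # ≈ 0.083, not the same: KL is asymmetric
print(kl(P, Q) + kl(Q, P))  # symmetric variant, unchanged if P and Q are swapped
```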