This function is important:

function KL = kldiv(varValue,pVect1,pVect2,varargin)

%KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.
% KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two
% distributions specified over the M variable values in vector X. P1 is a
% length-M vector of probabilities representing distribution 1, and P2 is a
% length-M vector of probabilities representing distribution 2. Thus, the
% probability of value X(i) is P1(i) for distribution 1 and P2(i) for
% distribution 2. The Kullback-Leibler divergence is given by:
%
% KL(P1(x),P2(x)) = sum[P1(x).*log2(P1(x)./P2(x))]   (log base 2, so the result is in bits)
%
% If X contains duplicate values, a warning message will be issued, and these
% values will be treated as distinct values. (I.e., the actual values do
% not enter into the computation, but the probabilities for the two
% duplicate values will be considered as probabilities corresponding to
% two unique values.) The elements of probability vectors P1 and P2 must
% each sum to 1 +/- .00001.
%
% A "log of zero" warning will be thrown for zero-valued probabilities.
% Handle this however you wish. Adding 'eps' or some other small value
% to all probabilities seems reasonable. (Renormalize if necessary.)
%
% KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler
% divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic
% (2001).
%
% KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by
% [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article
% for "Kullback朙eibler divergence". This is equal to 1/2 the so-called
% "Jeffrey divergence." See Rubner et al. (2000).
%
% EXAMPLE: Let the event set and probability sets be as follows:
% X = [1 2 3 3 4]';
% P1 = ones(5,1)/5;
% P2 = [0 0 .5 .2 .3]' + eps;
%
% Note that the event set here has duplicate values (two 3's). These
% will be treated as DISTINCT events by KLDIV. If you want these to
% be treated as the SAME event, you will need to collapse their
% probabilities together before running KLDIV. One way to do this
% is to use UNIQUE to find the set of unique events, and then
% iterate over that set, summing probabilities for each instance of
% each unique event. Here, we just leave the duplicate values to be
% treated independently (the default):
% KL = kldiv(X,P1,P2);
% KL =
% 19.4899
%
% Note also that we avoided the log-of-zero warning by adding 'eps'
% to all probability values in P2. We didn't need to renormalize
% because we're still within the sum-to-one tolerance.
%
% REFERENCES:
% 1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley,
% 1991.
% 2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler
% distance." IEEE Transactions on Information Theory (Submitted).
% 3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's
% distance as a metric for image retrieval." International Journal of
% Computer Vision, 40(2): 99-121.
% 4) Kullback–Leibler divergence. Wikipedia, The Free Encyclopedia.
%
% See also: MUTUALINFO, ENTROPY
if ~isequal(unique(varValue),sort(varValue))
    warning('KLDIV:duplicates','X contains duplicate values. Treated as distinct values.')
end
if ~isequal(size(varValue),size(pVect1)) || ~isequal(size(varValue),size(pVect2))
    error('All inputs must have same dimension.')
end
% Check probabilities sum to 1:
if (abs(sum(pVect1) - 1) > .00001) || (abs(sum(pVect2) - 1) > .00001)
    error('Probabilities don''t sum to 1.')
end
if ~isempty(varargin)
    switch varargin{1}
        case 'js'
            % Jensen-Shannon divergence: mean KL to the midpoint Q = (P1+P2)/2
            logQvect = log2((pVect2+pVect1)/2);
            KL = .5 * (sum(pVect1.*(log2(pVect1)-logQvect)) + ...
                       sum(pVect2.*(log2(pVect2)-logQvect)));
        case 'sym'
            % Symmetric variant: [KL(P1,P2) + KL(P2,P1)] / 2
            KL1 = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
            KL2 = sum(pVect2 .* (log2(pVect2)-log2(pVect1)));
            KL = (KL1+KL2)/2;
        otherwise
            error(['Last argument "' varargin{1} '" not recognized.'])
    end
else
    % Default: Kullback-Leibler divergence KL(P1||P2), in bits
    KL = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
end
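
The docstring's advice about collapsing duplicate events can be made concrete. Below is a minimal sketch, not part of kldiv itself (the names Xu, P1u, P2u, and idx are illustrative): unique and accumarray collapse the probabilities of repeated events, and adding eps followed by renormalizing avoids the log-of-zero warning.

% Collapse duplicate events before calling kldiv (illustrative sketch).
X  = [1 2 3 3 4]';
P1 = ones(5,1)/5;
P2 = [0 0 .5 .2 .3]';

[Xu,~,idx] = unique(X);        % unique events; idx maps each X(i) to its row in Xu
P1u = accumarray(idx,P1);      % sum probabilities of duplicate events
P2u = accumarray(idx,P2);

P1u = (P1u + eps)/sum(P1u + eps);   % avoid log(0), then renormalize
P2u = (P2u + eps)/sum(P2u + eps);

KL = kldiv(Xu,P1u,P2u);        % plain KL(P1||P2), in bits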
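
The 'sym' and 'js' options are called the same way. Reusing Xu, P1u, and P2u from the sketch above:

KLsym = kldiv(Xu,P1u,P2u,'sym');   % [KL(P1,P2)+KL(P2,P1)]/2
JS    = kldiv(Xu,P1u,P2u,'js');    % [KL(P1,Q)+KL(P2,Q)]/2, with Q = (P1+P2)/2

Because Q never has zero entries where P1 or P2 is positive, the Jensen-Shannon divergence stays finite and (with log base 2) is bounded by 1 bit, which makes it a safer choice when the distributions contain zeros.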

Reposted from: https://www.cnblogs.com/molakejin/p/5200193.html
