Jensen-Shannon divergence
sdriscoll
http://www.igorexchange.com/node/2318
//*****************************************************************************
// JSDivergence
// Created 20110703
//
// Shawn Driscoll
// Salk Institute for Biological Studies
//
// Calculates the Jensen-Shannon divergence between the probability
// distributions P and Q. P and Q must span the same range of values
// and contain the same number of points.
//
// Jensen-Shannon Divergence is defined as
//
// JSD(P||Q) = 1/2*D(P||M) + 1/2*D(Q||M)
//
// where M is 1/2*(P+Q) and D(P||M) is the Kullback-Leibler divergence.
//
// According to Lin (1991), the Jensen-Shannon divergence is bounded by 1.0
// when base-2 logarithms are used in the Kullback-Leibler terms:
//
// 0 <= JSD(P||Q) <= 1.0
//
// Note that this function returns the square root of JSD(P||Q), i.e. the
// Jensen-Shannon distance, which is a true metric.
//
//*****************************************************************************
function JSDivergence(P,Q)
//--------------------------------------------------------------------------
// parameters
//--------------------------------------------------------------------------
wave P,Q;
//--------------------------------------------------------------------------
// variables
//--------------------------------------------------------------------------
variable n_jsd = 0;
//--------------------------------------------------------------------------
// main function
//--------------------------------------------------------------------------
////
// create mean distribution
matrixop/free M = (P+Q)/2;
////
// calculate squared jensen-shannon divergence
n_jsd = 0.5*KLDivergence(P,M) + 0.5*KLDivergence(Q,M);
////
// take square root since this is the actual metric
n_jsd = sqrt(n_jsd);
//
// save to global variable
variable/g v_JSDivergence = n_jsd;
//
// print result
printf "v_JSDivergence= %g\r",n_jsd;
//
// return result
return n_jsd;
end
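// The function above calls a companion KLDivergence routine that is not
// included in this extract. A minimal sketch of such a helper is given below;
// its body is an assumption, not the author's original code. It computes
// D(P||Q) = sum_i P[i]*log2(P[i]/Q[i]), treats 0*log(0) as 0, and uses base-2
// logarithms so that the 0 <= JSD(P||Q) <= 1 bound quoted above applies.
// It assumes P and Q are normalized probability waves of equal length and
// that Q is nonzero wherever P is nonzero.
//*****************************************************************************
function KLDivergence(P,Q)
//--------------------------------------------------------------------------
// parameters
//--------------------------------------------------------------------------
wave P,Q;
//
// per-point contributions P[i]*log2(P[i]/Q[i]); 0*log(0) is treated as 0
duplicate/free P, w_terms;
w_terms = (P[p] > 0) ? P[p]*ln(P[p]/Q[p])/ln(2) : 0;
//
// return the total divergence in bits
return sum(w_terms);
end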
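// Quick usage sketch (the wave names wP and wQ are made up for illustration):
// build two normalized, equal-length probability waves and pass them to
// JSDivergence, which prints and returns the Jensen-Shannon distance.
// From the command line:
//
// make/o wP = {0.1, 0.4, 0.4, 0.1}
// make/o wQ = {0.25, 0.25, 0.25, 0.25}
// print JSDivergence(wP, wQ)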