kullbackLeiblerDivergence

Computes the Kullback-Leibler divergence between input ranges a and b, which is the sum of a[i] * log(a[i] / b[i]) over all elements; the base of the logarithm is 2. The ranges are assumed to contain elements in [0, 1]. Usually the ranges are normalized probability distributions, but this is neither required nor checked by kullbackLeiblerDivergence. If any element b[i] is zero and the corresponding element a[i] is nonzero, the function returns infinity. (Otherwise, if a[i] == 0 && b[i] == 0, the term a[i] * log(a[i] / b[i]) is considered zero.) If the inputs are normalized, the result is nonnegative, and it is zero exactly when the two distributions are identical.

kullbackLeiblerDivergence(Range1, Range2)(Range1 a, Range2 b)
if (isInputRange!(Range1) && isInputRange!(Range2));
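
For intuition, here is a minimal sketch of the computation described above, written against the input-range primitives. It is not the library's actual implementation; the name klSketch and the double accumulator are illustrative choices.

import std.math : log2;
import std.range.primitives : isInputRange, empty, front, popFront;

// Walk both ranges in lockstep, accumulating a[i] * log2(a[i] / b[i])
// with the zero-handling rules described above.
double klSketch(Range1, Range2)(Range1 a, Range2 b)
if (isInputRange!(Range1) && isInputRange!(Range2))
{
    double result = 0;
    for (; !a.empty && !b.empty; a.popFront(), b.popFront())
    {
        immutable double ai = a.front, bi = b.front;
        if (ai == 0) continue;               // a zero a[i] contributes nothing
        if (bi == 0) return double.infinity; // nonzero a[i] against a zero b[i]
        result += ai * log2(ai / bi);
    }
    return result;
}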

Examples

import std.math.operations : isClose;

double[] p = [ 0.0, 0, 0, 1 ];
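// The divergence of a distribution from itself is zero.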
assert(kullbackLeiblerDivergence(p, p) == 0);
double[] p1 = [ 0.25, 0.25, 0.25, 0.25 ];
assert(kullbackLeiblerDivergence(p1, p1) == 0);
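// p puts all its mass on the last element, so the only nonzero term
// is 1 * log2(1 / 0.25) = log2(4) = 2.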
assert(kullbackLeiblerDivergence(p, p1) == 2);
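// p1 is nonzero where p is zero, so the divergence is infinite.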
assert(kullbackLeiblerDivergence(p1, p) == double.infinity);
double[] p2 = [ 0.2, 0.2, 0.2, 0.4 ];
assert(isClose(kullbackLeiblerDivergence(p1, p2), 0.0719281, 1e-5));
assert(isClose(kullbackLeiblerDivergence(p2, p1), 0.0780719, 1e-5));
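
The inputs need not sum to 1, but no normalization is applied for you. If the data starts out as raw counts, std.numeric.normalize can rescale it in place first; a usage sketch (the counts array is illustrative):

double[] counts = [ 1.0, 1, 1, 1 ];
normalize(counts); // counts is now [0.25, 0.25, 0.25, 0.25]
assert(kullbackLeiblerDivergence(counts, p1) == 0);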
