Jensen-Shannon Divergence

The Jensen-Shannon divergence is a principled measure of the difference between two or more distributions. It is symmetric and, unlike the Kullback-Leibler divergence, always finite.

jensen_shannon_divergence(dists, weights=None)[source]

The Jensen-Shannon Divergence: H(sum(w_i*P_i)) - sum(w_i*H(P_i)), where H is the Shannon entropy.

Parameters:
  • dists ([Distribution]) – The distributions, P_i, to take the Jensen-Shannon Divergence of.
  • weights ([float], None) – The weights, w_i, to give the distributions. If None, the weights are assumed to be uniform.
Returns:

jsd (float) – The Jensen-Shannon Divergence

Raises:
  • ditException – Raised if dists and weights have unequal lengths.
  • InvalidNormalization – Raised if the weights do not sum to unity.
  • InvalidProbability – Raised if the weights are not valid probabilities.
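The formula above can be sketched directly in NumPy. This is an illustrative reimplementation over plain probability arrays, not the dit function itself (which operates on dit Distribution objects); the helper names `entropy` and `jsd` are our own:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def jsd(dists, weights=None):
    """H(sum(w_i*P_i)) - sum(w_i*H(P_i)) for aligned probability arrays."""
    dists = [np.asarray(d, dtype=float) for d in dists]
    if weights is None:
        # Uniform weights when none are given, matching the documented default.
        weights = np.full(len(dists), 1.0 / len(dists))
    else:
        weights = np.asarray(weights, dtype=float)
    if len(weights) != len(dists):
        raise ValueError("dists and weights have unequal lengths")
    if not np.isclose(weights.sum(), 1.0):
        raise ValueError("weights do not sum to unity")
    mixture = sum(w * d for w, d in zip(weights, dists))
    return entropy(mixture) - sum(w * entropy(d) for w, d in zip(weights, dists))

# Two maximally distinct distributions: the JSD is 1 bit.
print(jsd([[1.0, 0.0], [0.0, 1.0]]))  # -> 1.0
```

Because each P_i has zero entropy and the mixture is uniform over two outcomes, the result is exactly 1 bit; identical distributions give a JSD of 0.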