In probability theory and statistics, the Jensen–Shannon divergence is a method of measuring the similarity between two probability distributions. It is defined as the average of the Kullback–Leibler divergences of each distribution from their mixture M = (P + Q)/2, which makes it symmetric and always finite. It has broad applications, from bioinformatics (genome comparison) to the social sciences and machine learning.
The square root of the Jensen–Shannon divergence is a metric, often referred to as the Jensen–Shannon distance.
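The definition above can be sketched in a few lines of pure Python. This is a minimal illustration using base-2 logarithms (so the divergence is measured in bits and bounded by 1); the function names `kl_divergence` and `js_divergence` are illustrative, not from any particular library.

```python
import math

def kl_divergence(p, q):
    # Kullback-Leibler divergence in bits; terms with p_i = 0 contribute 0.
    # Assumes q_i > 0 wherever p_i > 0 (always true for the mixture M below).
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    # JSD(P || Q) = 1/2 * KL(P || M) + 1/2 * KL(Q || M), with M = (P + Q) / 2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(js_divergence(p, q))           # 0.5 (bits)
print(math.sqrt(js_divergence(p, q)))  # Jensen-Shannon distance, ~0.7071
```

Note that `js_divergence(p, q) == js_divergence(q, p)`, unlike the Kullback–Leibler divergence itself, and taking the square root of the result yields the Jensen–Shannon distance metric.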

Keywords: Statistics