In mathematical statistics, the Kullback–Leibler (KL) divergence (also called relative entropy and I-divergence), denoted $D_\text{KL}(P \parallel Q)$, is a type of statistical distance: a measure of how much an approximating probability distribution Q differs from a true probability distribution P. Mathematically, it is defined as

$$D_\text{KL}(P \parallel Q) = \sum_{x \in \mathcal{X}} P(x)\, \log\!\left(\frac{P(x)}{Q(x)}\right).$$
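As a concrete illustration of the discrete definition above, here is a minimal sketch in Python; the example distributions p and q are hypothetical and chosen only for illustration.

```python
import numpy as np

def kl_divergence(p, q):
    """Compute D_KL(P || Q) for discrete distributions given as probability arrays."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with P(x) > 0 contribute; the convention 0 * log(0/q) = 0 is used.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Hypothetical example: true distribution P vs. approximating distribution Q
p = [0.5, 0.3, 0.2]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))  # D_KL(P || Q), in nats since the natural logarithm is used
print(kl_divergence(q, p))  # generally a different value: KL divergence is not symmetric
```

Note that the two calls return different values, reflecting the fact that the KL divergence is not symmetric in its arguments and is therefore not a metric in the formal sense.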