Relative entropy in the context of "Information theory"

⭐ Core Definition: Relative entropy

In mathematical statistics, the Kullback–Leibler (KL) divergence (also called relative entropy and I-divergence), denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how much an approximating probability distribution Q differs from a true probability distribution P. Mathematically, for discrete distributions defined on the same sample space, it is defined as

D_KL(P ∥ Q) = Σ_x P(x) log( P(x) / Q(x) ),

where the sum is taken over all outcomes x.
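
To make the definition concrete, here is a minimal Python sketch that evaluates the sum directly, using the natural logarithm so the result is in nats; the distributions p and q below are illustrative placeholders, not values from this article.

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x)) for discrete P, Q."""
    # Terms with P(x) = 0 contribute 0 by convention; Q(x) must be > 0 wherever P(x) > 0.
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

# Illustrative example: P is a biased coin, Q is the fair coin used to approximate it.
p = [0.7, 0.3]
q = [0.5, 0.5]
print(kl_divergence(p, q))  # ~0.0823 nats
print(kl_divergence(q, p))  # ~0.0872 nats
```

Note that D_KL(P ∥ Q) and D_KL(Q ∥ P) generally differ, which is one reason relative entropy is called a divergence rather than a metric.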

👉 Relative entropy in the context of Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome of a roll of a fair die (which has six equally likely outcomes). Other important measures in information theory include mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
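
As a concrete companion to the coin-versus-die comparison above, the following minimal Python sketch computes Shannon entropy in bits for uniform distributions; the helper name entropy is my own, not taken from a particular library.

```python
import math

def entropy(probabilities):
    """H(X) = -sum over x of p(x) * log2(p(x)), measured in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [1/2] * 2  # two equally likely outcomes
fair_die = [1/6] * 6   # six equally likely outcomes

print(entropy(fair_coin))  # 1.0 bit of uncertainty
print(entropy(fair_die))   # ~2.585 bits: more outcomes, more uncertainty
```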
