Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It lies at the intersection of electrical engineering, mathematics, statistics, computer science, neurobiology, and physics.
As a simple example, if one flips a fair coin and does not know the outcome (heads or tails), then they lack a certain amount of information. If one looks at the coin, they will know the outcome and gain that same amount of information. For a fair coin, the probability of either heads or tails is 1/2, and the amount of information gained by observing the outcome is $\log_2 \frac{1}{1/2} = \log_2 2 = 1$ bit.
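The same calculation can be sketched in a few lines of Python. The helper `entropy_bits` below is illustrative rather than standard library code; it computes the Shannon entropy $H = -\sum_i p_i \log_2 p_i$ of an outcome distribution, which for a fair coin equals 1 bit.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outcomes, so observing it yields 1 bit.
print(entropy_bits([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so observing it conveys less information.
print(entropy_bits([0.9, 0.1]))  # about 0.47 bits
```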
