Shannon (unit) in the context of "Nat (unit)"

⭐ Core Definition: Shannon (unit)

The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory. IEC 80000-13 defines the shannon as the information content associated with an event when the probability of the event occurring is 1/2. It is understood as such within information theory, and is conceptually distinct from the bit, a term used in data processing and storage to denote a single instance of a binary signal. A sequence of n binary symbols (such as those stored in computer memory or carried by a binary data transmission) is properly described as consisting of n bits, but the information content of those n symbols may be more or less than n shannons, depending on the a priori probability of the actual sequence of symbols.
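
To make the distinction concrete, the short Python sketch below (an illustration, not from the source; the probabilities are assumed) computes the self-information of an event in shannons as −log2(p):

```python
# A minimal sketch (plain Python, no external libraries): the self-information
# of an event with probability p, measured in shannons, is -log2(p).
# The probabilities below are illustrative assumptions.
import math

def information_content_sh(p: float) -> float:
    """Self-information of an event with probability p, in shannons."""
    return -math.log2(p)

print(information_content_sh(0.5))    # 1.0 Sh: a fair coin flip carries exactly 1 shannon
print(information_content_sh(0.99))   # ~0.0145 Sh: a near-certain outcome carries little information

# Eight binary symbols always occupy 8 bits of storage, but if the particular
# 8-symbol sequence occurs with a priori probability 0.9, its information
# content is far less than 8 Sh:
print(information_content_sh(0.9))    # ~0.152 Sh
```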

The shannon also serves as a unit of the information entropy of an event, which is defined as the expected value of the information content of the event (i.e., the probability-weighted average of the information content of all potential events). Unlike information content, the entropy of a given number of possible outcomes has an upper bound, which is reached when those outcomes are equiprobable; the maximum entropy of n bits is n Sh. A further quantity expressed in shannons is channel capacity: generally, the maximum expected information content that can be encoded over a channel and transferred with negligible probability of error, typically stated as an information rate.
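
As a rough illustration of the entropy bound described above (a sketch with assumed example distributions, not from the source):

```python
# Entropy in shannons, computed as the probability-weighted average of -log2(p).
# It peaks when outcomes are equiprobable: for 2**n equiprobable outcomes it
# equals n Sh.
import math

def entropy_sh(probs):
    """Entropy in shannons of a discrete distribution given as probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy_sh([0.5, 0.5]))    # 1.0 Sh: the maximum for one binary symbol
print(entropy_sh([0.9, 0.1]))    # ~0.469 Sh: a biased coin, below the maximum
print(entropy_sh([0.25] * 4))    # 2.0 Sh: 4 equiprobable outcomes, i.e. 2 bits at capacity
```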

👉 Shannon (unit) in the context of Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy based on natural logarithms and powers of e, rather than the base 2 logarithms and powers of 2 that define the shannon. The unit is commonly referred to by its symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.

One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
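
These conversion factors follow directly from the change of logarithm base; a minimal check in Python (nothing assumed beyond the definitions above):

```python
# 1 nat = 1/ln 2 Sh and 1 nat = 1/ln 10 Hart, by the change-of-base rule.
import math

NAT_TO_SH = 1 / math.log(2)      # ≈ 1.4427 Sh per nat
NAT_TO_HART = 1 / math.log(10)   # ≈ 0.4343 Hart per nat

print(NAT_TO_SH)    # 1.4426950408889634
print(NAT_TO_HART)  # 0.4342944819032518
```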

Shannon (unit) in the context of Entropy (information theory)

In information theory, the entropy of a random variable quantifies the average level of uncertainty or information associated with the variable's potential states or possible outcomes. It measures the expected amount of information needed to describe the state of the variable, considering the distribution of probabilities across all potential states. Given a discrete random variable $X$, which takes values in the set $\mathcal{X}$ and is distributed according to $p\colon \mathcal{X} \to [0, 1]$, the entropy is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

where $\sum$ denotes the sum over the variable's possible values. The choice of base for the logarithm $\log$ varies between applications: base 2 gives the unit of bits (or "shannons"), base e gives "natural units" (nat), and base 10 gives units of "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable.
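
A minimal sketch of this definition, computing the same distribution's entropy in each of the three bases (the example distribution is an illustrative assumption):

```python
# The same entropy expressed in shannons (base 2), nats (base e),
# and hartleys (base 10); only the logarithm base changes.
import math

def entropy(probs, base):
    """Entropy of a discrete distribution in the unit set by the log base."""
    return sum(-p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(entropy(p, 2))        # 1.5 Sh
print(entropy(p, math.e))   # ~1.0397 nat
print(entropy(p, 10))       # ~0.4515 Hart
```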

The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is also referred to as Shannon entropy. Shannon's theory defines a data communication system composed of three elements: a source of data, a communication channel, and a receiver. The "fundamental problem of communication" – as expressed by Shannon – is for the receiver to be able to identify what data was generated by the source, based on the signal it receives through the channel. Shannon considered various ways to encode, compress, and transmit messages from a data source, and proved in his source coding theorem that the entropy represents an absolute mathematical limit on how well data from the source can be losslessly compressed onto a perfectly noiseless channel. Shannon strengthened this result considerably for noisy channels in his noisy-channel coding theorem.

Shannon (unit) in the context of Hartley (unit)

The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.

If base 2 logarithms and powers of 2 are used instead, then the unit of information is the shannon or bit, which is the information content of an event if the probability of that event occurring is 1/2. Natural logarithms and powers of e define the nat.
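
A small follow-up sketch of how the three units relate under the change-of-base rule (values derived from the definitions above, nothing assumed beyond them):

```python
# One hartley expressed in shannons and in nats.
import math

HART_TO_SH = math.log2(10)   # ≈ 3.3219 Sh: one decimal digit ≈ 3.32 binary digits
HART_TO_NAT = math.log(10)   # ≈ 2.3026 nat

print(HART_TO_SH, HART_TO_NAT)
```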

Shannon (unit) in the context of Mutual information

In probability theory and information theory, the mutual information (MI) of two random variables is a measure of the mutual dependence between the two variables. More specifically, it quantifies the "amount of information" (in units such as shannons (bits), nats or hartleys) obtained about one random variable by observing the other random variable. The concept of mutual information is intimately linked to that of entropy of a random variable, a fundamental notion in information theory that quantifies the expected "amount of information" held in a random variable.

Unlike the correlation coefficient, which is limited to real-valued random variables and linear dependence, MI is more general: it determines how different the joint distribution of the pair (X, Y) is from the product of the marginal distributions of X and Y. MI is the expected value of the pointwise mutual information (PMI).
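
As a rough sketch of MI as the expected PMI, the example below uses an illustrative 2×2 joint distribution (the numbers are assumptions, not from the text) and reports the result in shannons:

```python
# Mutual information in shannons, computed as the expectation of the
# pointwise mutual information log2( p(x,y) / (p(x) p(y)) ).
import math

joint = {                       # assumed joint distribution p(x, y)
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

mi = sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)
print(mi)   # ≈ 0.278 Sh: how far the joint distribution is from px * py
```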
