Binary logarithm in the context of "Hartley (unit)"


The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.

If base 2 logarithms and powers of 2 are used instead, then the unit of information is the shannon or bit, which is the information content of an event if the probability of that event occurring is 1/2. Natural logarithms and powers of e define the nat.
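These three units differ only in the base of the logarithm used to measure self-information. As a minimal sketch (in Python, with an illustrative helper named information that is defined here for the example and is not part of any standard library), the information content of an event with probability p is the negative logarithm of p, taken in the base that defines the unit:

```python
import math

def information(p, base):
    """Self-information of an event with probability p, measured with the given log base."""
    return -math.log(p, base)

# One a priori equiprobable decimal digit: p = 1/10 carries 1 hartley (dit).
print(information(1/10, 10))       # ≈ 1.0 hartley
# One equiprobable binary digit: p = 1/2 carries 1 shannon (bit).
print(information(1/2, 2))         # ≈ 1.0 shannon
# The same decimal digit expressed in the other two units.
print(information(1/10, 2))        # ≈ 3.32 shannons
print(information(1/10, math.e))   # ≈ 2.30 nats
```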

Binary logarithm in the context of Logarithm

In mathematics, the logarithm of a number is the exponent by which another fixed value, the base, must be raised to produce that number. For example, the logarithm of 1000 to base 10 is 3, because 1000 is 10 to the 3rd power: 1000 = 10³ = 10 × 10 × 10. More generally, if x = bʸ, then y is the logarithm of x to base b, written log_b x, so log₁₀ 1000 = 3. As a single-variable function, the logarithm to base b is the inverse of exponentiation with base b.

The logarithm base 10 is called the decimal or common logarithm and is commonly used in science and engineering. The natural logarithm has the number e ≈ 2.718 as its base; its use is widespread in mathematics and physics because of its very simple derivative. The binary logarithm uses base 2 and is widely used in computer science, information theory, music theory, and photography. When the base is unambiguous from the context or irrelevant it is often omitted, and the logarithm is written log x.
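As a small worked example (a Python sketch assuming only the standard math module; the variable names are illustrative), the change-of-base identity log_b x = ln x / ln b relates the binary logarithm to the other bases, and exponentiation with the same base undoes the logarithm:

```python
import math

# Common (base 10) logarithm: log10(1000) = 3 because 10**3 == 1000.
y = math.log10(1000)
print(y)          # 3.0
print(10 ** y)    # 1000.0; exponentiation with the same base inverts the logarithm

# Binary logarithm, directly and via the change-of-base identity log2(x) = ln(x) / ln(2).
print(math.log2(1024))                 # 10.0
print(math.log(1024) / math.log(2))    # ≈ 10.0 (same value, up to floating-point rounding)
```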


Binary logarithm in the context of Nat (unit)

The natural unit of information (symbol: nat), sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e, rather than the powers of 2 and base 2 logarithms, which define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.

One nat is equal to 1/ln 2 shannons ≈ 1.44 Sh or, equivalently, 1/ln 10 hartleys ≈ 0.434 Hart.
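These conversion factors follow directly from the change of base; a short Python check (assuming only the standard math module, with illustrative variable names):

```python
import math

nat_to_shannon = 1 / math.log(2)    # 1 nat in shannons, ≈ 1.4427
nat_to_hartley = 1 / math.log(10)   # 1 nat in hartleys, ≈ 0.4343
print(round(nat_to_shannon, 2))     # 1.44
print(round(nat_to_hartley, 3))     # 0.434

# Consistency check: nats -> shannons -> hartleys agrees with nats -> hartleys.
shannon_to_hartley = math.log(2) / math.log(10)        # = log10(2) ≈ 0.301
print(round(nat_to_shannon * shannon_to_hartley, 3))   # 0.434
```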
