Ralph Hartley in the context of "Information-theoretic"


⭐ Core Definition: Ralph Hartley

Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an American electronics researcher. He invented the Hartley oscillator and the Hartley transform, and contributed to the foundations of information theory.

The hartley, a unit of information equal to one decimal digit, is named after him.


👉 Ralph Hartley in the context of Information-theoretic

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.

As a simple example, if one flips a fair coin and does not know the outcome (heads or tails), then one lacks a certain amount of information. Looking at the coin reveals the outcome and supplies that same amount of information. For a fair coin, the probability of either heads or tails is 1/2, and that amount of information can be expressed as log2(1 / (1/2)) = log2(2) = 1 bit of information.
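To make the arithmetic concrete, here is a minimal Python sketch (the function name information_content is my own, not from the source) computing the self-information -log2(p) of an event with probability p:

```python
import math

def information_content(p: float) -> float:
    """Self-information -log2(p) of an event with probability p, in bits."""
    return -math.log2(p)

# A fair coin: each outcome has probability 1/2.
print(information_content(1 / 2))  # 1.0 bit
```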


Ralph Hartley in the context of Information theory


A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome from a roll of a die (which has six equally likely outcomes). Other important measures in information theory include mutual information, channel capacity, error exponents, and relative entropy. Important subfields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
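As a sketch of the coin-versus-die comparison above (the helper name entropy is my own, not an established API), Shannon entropy can be computed directly from its definition H = -Σ p·log2(p):

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([1/2] * 2))  # fair coin: 1.0 bit
print(entropy([1/6] * 6))  # fair die: ~2.585 bits (more outcomes, more uncertainty)
```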


Ralph Hartley in the context of Noisy channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes called Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.

The Shannon limit or Shannon capacity of a communication channel is the maximum rate at which data can theoretically be transferred error-free over the channel at a given noise level. It was first described by Shannon (1948) and published shortly afterwards in The Mathematical Theory of Communication (1949), a book co-authored with Warren Weaver. This founded the modern discipline of information theory.
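For the common special case of a band-limited channel with additive white Gaussian noise, the Shannon–Hartley theorem gives this capacity in closed form as C = B·log2(1 + S/N). A minimal Python sketch (the function name and example figures are illustrative assumptions, not from the source):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz telephone-style channel at 30 dB SNR (linear SNR = 1000).
print(shannon_capacity(3_000, 1_000))  # ~29,902 bits per second
```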


Ralph Hartley in the context of Hartley (unit)

The hartley (symbol Hart), also called a ban or a dit (short for "decimal digit"), is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10. It is therefore equal to the information contained in one decimal digit (or dit), assuming a priori equiprobability of each possible value. It is named after Ralph Hartley.

If base 2 logarithms and powers of 2 are used instead, then the unit of information is the shannon or bit, which is the information content of an event if the probability of that event occurring is 1/2. Natural logarithms and powers of e define the nat.
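Since the three units differ only in the base of the logarithm, converting between them is a matter of constant factors: 1 hartley = log2(10) ≈ 3.322 bits (shannons) = ln(10) ≈ 2.303 nats. A small Python sketch (the names are my own):

```python
import math

BITS_PER_HARTLEY = math.log2(10)  # ~3.3219 shannons (bits) per hartley
NATS_PER_HARTLEY = math.log(10)   # ~2.3026 nats per hartley

def hartleys(p: float) -> float:
    """Information content -log10(p) of an event with probability p, in hartleys."""
    return -math.log10(p)

info = hartleys(1 / 10)
print(info)                     # 1.0 hartley: one decimal digit
print(info * BITS_PER_HARTLEY)  # the same quantity, ~3.32 bits
```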


Ralph Hartley in the context of Hartley oscillator

The Hartley oscillator is an electronic oscillator circuit in which the oscillation frequency is determined by a tuned circuit consisting of capacitors and inductors, that is, an LC oscillator. The circuit was invented in 1915 by American engineer Ralph Hartley. The distinguishing feature of the Hartley oscillator is that the tuned circuit consists of a single capacitor in parallel with two inductors in series (or a single tapped inductor), and the feedback signal needed for oscillation is taken from the center connection of the two inductors.
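The oscillation frequency is set by the resonance of the tuned circuit. Neglecting mutual inductance between the two coils (an assumption on my part; coupled coils add a 2M term to the total inductance), f = 1 / (2π √((L1 + L2)·C)). A minimal Python sketch with illustrative component values:

```python
import math

def hartley_frequency(l1_henry: float, l2_henry: float, c_farad: float) -> float:
    """Approximate resonant frequency f = 1 / (2*pi*sqrt((L1 + L2) * C)),
    neglecting mutual inductance between the two series coils."""
    return 1.0 / (2 * math.pi * math.sqrt((l1_henry + l2_henry) * c_farad))

# Example: two 100 uH coils with a 100 pF tuning capacitor.
print(hartley_frequency(100e-6, 100e-6, 100e-12))  # ~1.13 MHz
```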
