Harry Nyquist in the context of "Information theory"

⭐ Core Definition: Harry Nyquist

Harry Theodor Nyquist (/ˈnaɪkwɪst/, Swedish: [ˈnŷːkvɪst]; February 7, 1889 – April 4, 1976) was a Swedish-American physicist and electronic engineer who made important contributions to communication theory.

👉 Harry Nyquist in the context of Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It lies at the intersection of electrical engineering, mathematics, statistics, computer science, neurobiology, and physics.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome of a fair die roll (which has six equally likely outcomes). Other important measures in information theory include mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory, and information-theoretic security.
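
To make the coin-versus-die comparison concrete, the short Python sketch below evaluates the Shannon entropy H = −Σ p log₂(p) for both distributions (the helper name `entropy` is chosen here purely for illustration): a fair coin carries 1 bit of uncertainty, a fair die about 2.585 bits.

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

coin = [0.5, 0.5]   # fair coin: two equally likely outcomes
die = [1 / 6] * 6   # fair die: six equally likely outcomes

print(f"H(coin) = {entropy(coin):.3f} bits")  # 1.000
print(f"H(die)  = {entropy(die):.3f} bits")   # 2.585
```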

In this Dossier

👉 Harry Nyquist in the context of Noisy-channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes called Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.
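
One concrete instance of such a computable maximum rate is the binary symmetric channel, whose capacity is C = 1 − H_b(p) bits per channel use, where H_b is the binary entropy function and p the bit-flip probability. The Python sketch below (function names are illustrative) evaluates it:

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H_b(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

# A channel that flips 10% of its bits still supports roughly 0.531 bits of
# (asymptotically) error-free information per transmitted bit.
print(f"C(p=0.1) = {bsc_capacity(0.1):.3f} bits per channel use")
```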

The Shannon limit or Shannon capacity of a communication channel is the maximum rate at which error-free data can theoretically be transferred over the channel, for a particular noise level, when the link is subject to random transmission errors. It was first described by Shannon (1948) and shortly afterwards published in a book by Shannon and Warren Weaver entitled The Mathematical Theory of Communication (1949). This founded the modern discipline of information theory.
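
For the common special case of a band-limited channel with additive white Gaussian noise, the Shannon limit takes the Shannon–Hartley form C = B log₂(1 + S/N). The sketch below evaluates it for an assumed telephone-grade channel of 3 kHz bandwidth and 30 dB signal-to-noise ratio (the example figures are illustrative, not taken from the text above):

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """AWGN channel capacity: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_linear = 10 ** (30 / 10)                  # 30 dB SNR -> S/N = 1000
c = shannon_hartley_capacity(3000, snr_linear)
print(f"C = {c:.0f} bit/s")                   # about 29,900 bit/s
```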

👉 Harry Nyquist in the context of Nyquist frequency

In signal processing, the Nyquist frequency (or folding frequency), named after Harry Nyquist, is a characteristic of a sampler, which converts a continuous function or signal into a discrete sequence. For a given sampling rate (samples per second), the Nyquist frequency (cycles per second) is the frequency whose cycle-length (or period) is twice the interval between samples, thus 0.5 cycle/sample. For example, audio CDs have a sampling rate of 44100 samples/second. At 0.5 cycle/sample, the corresponding Nyquist frequency is 22050 cycles/second (Hz). Conversely, the Nyquist rate for sampling a 22050 Hz signal is 44100 samples/second.
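
The CD figures above can be reproduced with two one-line helpers (the function names are illustrative): the Nyquist frequency is half the sampling rate, and the Nyquist rate is twice a signal's highest frequency.

```python
def nyquist_frequency(sample_rate_hz):
    """Nyquist frequency of a sampler: half the sampling rate."""
    return sample_rate_hz / 2

def nyquist_rate(highest_frequency_hz):
    """Nyquist rate for a signal: twice its highest frequency component."""
    return 2 * highest_frequency_hz

print(nyquist_frequency(44100))  # 22050.0 Hz for audio-CD sampling
print(nyquist_rate(22050))       # 44100 samples/second
```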

When the highest frequency (bandwidth) of a signal is less than the Nyquist frequency of the sampler, the resulting discrete-time sequence is said to be free of the distortion known as aliasing, and the corresponding sample rate is said to be above the Nyquist rate for that particular signal.
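
A quick numerical check of the aliasing condition: a pure tone above the Nyquist frequency folds back into the band [0, fs/2], while a tone below it is unchanged. The helper below (an illustrative sketch, not a standard library routine) computes the apparent frequency of a sampled tone:

```python
def apparent_frequency(tone_hz, sample_rate_hz):
    """Frequency at which a sampled pure tone appears, after folding into [0, fs/2]."""
    f = tone_hz % sample_rate_hz
    return min(f, sample_rate_hz - f)

fs = 44100
print(apparent_frequency(10000, fs))  # 10000 Hz: below the Nyquist frequency, no aliasing
print(apparent_frequency(30000, fs))  # 14100 Hz: a 30 kHz tone aliases (folds) to 14.1 kHz
```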
