Harry Nyquist in the context of "Nyquist rate"

⭐ Core Definition: Harry Nyquist

Harry Theodor Nyquist (/ˈnaɪkwɪst/, Swedish: [ˈnŷːkvɪst]; February 7, 1889 – April 4, 1976) was a Swedish-American physicist and electronic engineer who made important contributions to communication theory.


👉 Harry Nyquist in the context of Nyquist rate

In signal processing, the Nyquist rate, named after Harry Nyquist, is a value equal to twice the highest frequency (bandwidth) of a given function or signal. It has units of samples per unit time, conventionally expressed as samples per second, or hertz (Hz). When the signal is sampled at a rate higher than its Nyquist rate, the resulting discrete-time sequence is said to be free of the distortion known as aliasing. Conversely, for a given sample rate the corresponding Nyquist frequency is one-half the sample rate. Note that the Nyquist rate is a property of a continuous-time signal, whereas the Nyquist frequency is a property of a discrete-time system.
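A minimal sketch of this relationship in Python, assuming an example signal whose highest frequency is 5 kHz (the bandwidth and sample rates below are illustrative values, not taken from the text):

```python
def nyquist_rate(bandwidth_hz: float) -> float:
    """Nyquist rate = twice the highest frequency (bandwidth) of the signal."""
    return 2.0 * bandwidth_hz

def is_alias_free(sample_rate_hz: float, bandwidth_hz: float) -> bool:
    """Sampling above the Nyquist rate keeps the discrete-time sequence free of aliasing."""
    return sample_rate_hz > nyquist_rate(bandwidth_hz)

bandwidth = 5_000.0                        # assumed example: highest frequency in the signal, Hz
print(nyquist_rate(bandwidth))             # 10000.0 samples/s
print(is_alias_free(12_000.0, bandwidth))  # True  -- sampled above the Nyquist rate
print(is_alias_free(8_000.0, bandwidth))   # False -- undersampled, aliasing expected
```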

The term Nyquist rate is also used in a different context with units of symbols per second, which is actually the field in which Harry Nyquist was working. In that context it is an upper bound for the symbol rate across a bandwidth-limited baseband channel such as a telegraph line or passband channel such as a limited radio frequency band or a frequency division multiplex channel.
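In that signalling sense, the bound says an ideal bandwidth-limited channel of bandwidth B can carry at most 2B independent symbols per second, and at most 2B·log₂(M) bit/s with M distinct signalling levels. A small sketch under assumed example values (a 3 kHz telephone-style channel with 4 levels):

```python
import math

def max_symbol_rate(bandwidth_hz: float) -> float:
    """Nyquist rate in the signalling sense: at most 2B symbols/s over bandwidth B."""
    return 2.0 * bandwidth_hz

def max_bit_rate(bandwidth_hz: float, levels: int) -> float:
    """Noiseless bit-rate bound: 2B * log2(M) for M distinct symbol levels."""
    return max_symbol_rate(bandwidth_hz) * math.log2(levels)

print(max_symbol_rate(3_000.0))   # 6000.0 symbols/s over an assumed 3 kHz channel
print(max_bit_rate(3_000.0, 4))   # 12000.0 bit/s with 4 signalling levels
```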

In this Dossier

Harry Nyquist in the context of Information theory

Information theory is the mathematical study of the quantification, storage, and communication of information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (which has two equally likely outcomes) provides less information (lower entropy, less uncertainty) than identifying the outcome from a roll of a die (which has six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.
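The coin-versus-die comparison can be made concrete in a few lines of Python; the probabilities below are simply the uniform distributions described above.

```python
import math

def entropy(probabilities) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [1/2] * 2
fair_die = [1/6] * 6

print(entropy(fair_coin))  # 1.0 bit
print(entropy(fair_die))   # ~2.585 bits -- the die outcome carries more uncertainty
```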


Harry Nyquist in the context of Noisy channel coding theorem

In information theory, the noisy-channel coding theorem (sometimes Shannon's theorem or Shannon's limit) establishes that for any given degree of noise contamination of a communication channel, it is possible (in theory) to communicate discrete data (digital information) nearly error-free up to a computable maximum rate through the channel. This result was presented by Claude Shannon in 1948 and was based in part on earlier work and ideas of Harry Nyquist and Ralph Hartley.

The Shannon limit or Shannon capacity of a communication channel refers to the maximum rate of error-free data that can theoretically be transferred over the channel if the link is subject to random data transmission errors, for a particular noise level. It was first described by Shannon (1948), and shortly after published in a book by Shannon and Warren Weaver entitled The Mathematical Theory of Communication (1949). This founded the modern discipline of information theory.
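For the common case of a channel with additive white Gaussian noise, the Shannon limit takes the Shannon–Hartley form C = B·log₂(1 + S/N). A minimal sketch, using an assumed example bandwidth of 3 kHz and a 30 dB signal-to-noise ratio:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley capacity in bit/s: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)    # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

print(shannon_capacity(3_000.0, 30.0))  # ~29902 bit/s for the assumed example channel
```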


Harry Nyquist in the context of Nyquist frequency

In signal processing, the Nyquist frequency (or folding frequency), named after Harry Nyquist, is a characteristic of a sampler, which converts a continuous function or signal into a discrete sequence. For a given sampling rate (samples per second), the Nyquist frequency (cycles per second) is the frequency whose cycle-length (or period) is twice the interval between samples, thus 0.5 cycle/sample. For example, audio CDs have a sampling rate of 44100 samples/second. At 0.5 cycle/sample, the corresponding Nyquist frequency is 22050 cycles/second (Hz). Conversely, the Nyquist rate for sampling a 22050 Hz signal is 44100 samples/second.

When the highest frequency (bandwidth) of a signal is less than the Nyquist frequency of the sampler, the resulting discrete-time sequence is said to be free of the distortion known as aliasing, and the corresponding sample rate is said to be above the Nyquist rate for that particular signal.
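The CD numbers above can be reproduced directly; the 25 kHz test tone below is an assumed example of a component lying above the Nyquist frequency, showing how it folds back below it:

```python
def nyquist_frequency(sample_rate_hz: float) -> float:
    """Nyquist (folding) frequency: half the sample rate."""
    return sample_rate_hz / 2.0

def folded_frequency(tone_hz: float, sample_rate_hz: float) -> float:
    """Frequency at which a sampled tone appears after folding into [0, fs/2]."""
    return abs(tone_hz - round(tone_hz / sample_rate_hz) * sample_rate_hz)

fs = 44_100.0                          # CD sampling rate, samples/s
print(nyquist_frequency(fs))           # 22050.0 Hz
print(folded_frequency(20_000.0, fs))  # 20000.0 Hz: below the Nyquist frequency, kept as-is
print(folded_frequency(25_000.0, fs))  # 19100.0 Hz: above the Nyquist frequency, aliases downward
```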


Harry Nyquist in the context of Information-theoretic

Information theory is the mathematical study of the quantification, storage, and communication of a particular type of mathematically defined information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of Harry Nyquist and Ralph Hartley. It is at the intersection of electronic engineering, mathematics, statistics, computer science, neurobiology, physics, and electrical engineering.

As a simple example, if one flips a fair coin and does not know the outcome (heads or tails), then they lack a certain amount of information. If one looks at the coin, they will know the outcome and gain that same amount of information. For a fair coin, the probability of either heads or tails is 1/2, and that amount of information can be expressed as log₂(2) = 1 bit of information.
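That single-outcome figure is the self-information -log₂(p) of the event; a couple of lines confirm the 1-bit value (the die probability is included only for contrast).

```python
import math

def self_information(p: float) -> float:
    """Information gained on learning an outcome of probability p, in bits."""
    return -math.log2(p)

print(self_information(1/2))  # 1.0 bit    -- result of a fair coin flip
print(self_information(1/6))  # ~2.585 bits -- one face of a fair die, for comparison
```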


Harry Nyquist in the context of Fluctuation-dissipation theorem

The fluctuation–dissipation theorem (FDT) or fluctuation–dissipation relation (FDR) is a powerful tool in statistical physics for predicting the behavior of systems that obey detailed balance. Given that a system obeys detailed balance, the theorem is a proof that thermodynamic fluctuations in a physical variable predict the response quantified by the admittance or impedance (in their general sense, not only in electromagnetic terms) of the same physical variable (like voltage, temperature difference, etc.), and vice versa. The fluctuation–dissipation theorem applies both to classical and quantum mechanical systems.

The fluctuation–dissipation theorem was proven by Herbert Callen and Theodore Welton in 1951 and expanded by Ryogo Kubo. There are antecedents to the general theorem, including Einstein's explanation of Brownian motion during his annus mirabilis and Harry Nyquist's explanation in 1928 of Johnson noise in electrical resistors.
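Nyquist's 1928 result gives the mean-square thermal noise voltage of a resistor as 4·k_B·T·R·Δf. A minimal sketch with assumed example values (a 1 kΩ resistor at room temperature over a 10 kHz measurement bandwidth):

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(resistance_ohm: float, temperature_k: float, bandwidth_hz: float) -> float:
    """RMS Johnson-Nyquist noise voltage: sqrt(4 * k_B * T * R * bandwidth)."""
    return math.sqrt(4 * BOLTZMANN * temperature_k * resistance_ohm * bandwidth_hz)

# Assumed example: a 1 kOhm resistor at 300 K measured over 10 kHz.
print(johnson_noise_vrms(1_000.0, 300.0, 10_000.0))  # ~4.1e-7 V, i.e. about 0.4 microvolts
```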
