Euler's number in the context of "Leonhard Euler"

Core Definition: Euler's number

The number e is a mathematical constant, approximately equal to 2.71828, that is the base of the natural logarithm and exponential function. It is sometimes called Euler's number, after the Swiss mathematician Leonhard Euler, though this can invite confusion with the Euler numbers, or with Euler's constant, a different constant typically denoted γ. Alternatively, e can be called Napier's constant after John Napier. The Swiss mathematician Jacob Bernoulli discovered the constant while studying compound interest.
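
Concretely, the compound-interest problem Bernoulli studied gives rise to e as a limit (a standard statement of the result, paraphrased here rather than quoted from the source):

$$ e = \lim_{n \to \infty} \left(1 + \frac{1}{n}\right)^{n} $$

This is the value an account balance approaches when interest at a 100% annual rate is compounded n times per year and n grows without bound.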

The number e is of great importance in mathematics, alongside 0, 1, π, and i. All five appear in one formulation of Euler's identity and play important and recurring roles across mathematics. Like the constant π, e is irrational, meaning that it cannot be represented as a ratio of integers. Moreover, it is transcendental, meaning that it is not a root of any non-zero polynomial with rational coefficients. To 30 decimal places, the value of e is:

2.718281828459045235360287471352
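
The formulation of Euler's identity mentioned above, which ties all five constants together in a single equation, is:

$$ e^{i\pi} + 1 = 0 $$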

Euler's number in the context of Entropy (information theory)

In information theory, the entropy of a random variable quantifies the average level of uncertainty or information associated with the variable's potential states or possible outcomes. This measures the expected amount of information needed to describe the state of the variable, considering the distribution of probabilities across all potential states. Given a discrete random variable X, which takes values in the set 𝒳 and is distributed according to p : 𝒳 → [0, 1], the entropy is

$$ H(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x), $$

where Σ denotes the sum over the variable's possible values. The choice of base for log, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable.
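
This definition translates directly into code. The sketch below is our own minimal illustration, not from the source; the function name entropy and the example distributions are assumptions:

    import math

    def entropy(probs, base=2):
        """Shannon entropy: H = -sum of p * log_base(p) over all outcomes.

        base=2 yields bits ("shannons"), base=math.e yields nats,
        and base=10 yields hartleys. Terms with p == 0 are skipped,
        following the convention 0 * log 0 = 0.
        """
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    # A fair coin has exactly 1 bit of entropy; the same uncertainty
    # measured in natural units is ln(2) ≈ 0.693 nats.
    print(entropy([0.5, 0.5]))               # 1.0
    print(entropy([0.5, 0.5], base=math.e))  # 0.6931...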

The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is also referred to as Shannon entropy. Shannon's theory defines a data communication system composed of three elements: a source of data, a communication channel, and a receiver. The "fundamental problem of communication" – as expressed by Shannon – is for the receiver to be able to identify what data was generated by the source, based on the signal it receives through the channel. Shannon considered various ways to encode, compress, and transmit messages from a data source, and proved in his source coding theorem that the entropy represents an absolute mathematical limit on how well data from the source can be losslessly compressed onto a perfectly noiseless channel. Shannon strengthened this result considerably for noisy channels in his noisy-channel coding theorem.
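
As a hedged numerical illustration of that compression limit (our own example, not one from Shannon's paper): a memoryless source emitting symbol a with probability 0.9 and symbol b with probability 0.1 has entropy

$$ H = -0.9 \log_2 0.9 - 0.1 \log_2 0.1 \approx 0.469 \text{ bits per symbol}, $$

so no lossless code for this source can use fewer than about 0.469 bits per symbol on average.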

Euler's number in the context of Leonhard Euler

Leonhard Euler (/ˈɔɪlər/ OY-lər; 15 April 1707 – 18 September 1783) was a Swiss polymath who was active as a mathematician, physicist, astronomer, logician, geographer, and engineer. He founded the studies of graph theory and topology and made influential discoveries in many other branches of mathematics, such as analytic number theory, complex analysis, and infinitesimal calculus. He also introduced much of modern mathematical terminology and notation, including the notion of a mathematical function. He is known for his work in mechanics, fluid dynamics, optics, astronomy, and music theory. Euler has been called a "universal genius" who "was fully equipped with almost unlimited powers of imagination, intellectual gifts and extraordinary memory". He spent most of his adult life in Saint Petersburg, Russia, and in Berlin, then the capital of Prussia.

Euler is credited for popularizing the Greek letter π (lowercase pi) to denote the ratio of a circle's circumference to its diameter, as well as first using the notation f(x) for the value of a function, the letter i to express the imaginary unit √−1, the Greek letter Σ (capital sigma) to express summations, the Greek letter Δ (capital delta) for finite differences, and lowercase letters to represent the sides of a triangle while representing the angles as capital letters. He gave the current definition of the constant e, the base of the natural logarithm, now known as Euler's number. Euler made contributions to applied mathematics and engineering, such as his study of ships, which helped navigation; his three volumes on optics, which contributed to the design of microscopes and telescopes; and his studies of beam bending and column critical loads.
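
For reference, one standard series associated with Euler's treatment of the constant (a well-known result, stated here in our own words) is:

$$ e = \sum_{n=0}^{\infty} \frac{1}{n!} = 1 + 1 + \frac{1}{2!} + \frac{1}{3!} + \cdots \approx 2.71828 $$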
