A Mathematical Theory of Communication in the context of "Entropy (information theory)"

⭐ Core Definition: A Mathematical Theory of Communication

"A Mathematical Theory of Communication" is an article by mathematician Claude Shannon published in Bell System Technical Journal in 1948. It was renamed The Mathematical Theory of Communication in the 1949 book of the same name, a small but significant title change after realizing the generality of this work. It has tens of thousands of citations, being one of the most influential and cited scientific papers of all time, as it gave rise to the field of information theory, with Scientific American referring to the paper as the "Magna Carta of the Information Age", while the electrical engineer Robert G. Gallager called the paper a "blueprint for the digital era". Historian James Gleick rated the paper as the most important development of 1948, placing the transistor second in the same time period, with Gleick emphasizing that the paper by Shannon was "even more profound and more fundamental" than the transistor.

It has also been noted that, "as did relativity and quantum theory, information theory radically changed the way scientists look at the universe". The paper also formally introduced the term "bit" and serves as its theoretical foundation.

👉 A Mathematical Theory of Communication in the context of Entropy (information theory)

In information theory, the entropy of a random variable quantifies the average level of uncertainty or information associated with the variable's potential states or possible outcomes. It measures the expected amount of information needed to describe the state of the variable, taking into account the distribution of probabilities across all potential states. Given a discrete random variable $X$, which takes values in the set $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x),$$

where $\sum$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies between applications: base 2 gives the unit of bits (or "shannons"), base $e$ gives "natural units" (nats), and base 10 gives units of "dits", "bans", or "hartleys". An equivalent definition of entropy is the expected value of the self-information of a variable.
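
The definition is straightforward to compute directly. Below is a minimal sketch in Python (standard library only; the function name `shannon_entropy` is my own choice, not from the paper):

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum of p(x) * log_base p(x) over a discrete distribution.

    `probs` is a sequence of probabilities summing to 1. Outcomes with
    zero probability contribute nothing, since p * log p -> 0 as p -> 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))          # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))          # ~0.469 bits: a biased coin is more predictable
print(shannon_entropy([0.25] * 4))          # 2.0 bits: four equally likely outcomes
print(shannon_entropy([0.5, 0.5], math.e))  # ~0.693 nats: same coin, base e
```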

The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is also referred to as Shannon entropy. Shannon's theory defines a data communication system composed of three elements: a source of data, a communication channel, and a receiver. The "fundamental problem of communication" – as expressed by Shannon – is for the receiver to be able to identify what data was generated by the source, based on the signal it receives through the channel. Shannon considered various ways to encode, compress, and transmit messages from a data source, and proved in his source coding theorem that the entropy represents an absolute mathematical limit on how well data from the source can be losslessly compressed onto a perfectly noiseless channel. Shannon strengthened this result considerably for noisy channels in his noisy-channel coding theorem.
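
To make the source coding theorem concrete, the sketch below (again Python; `huffman_lengths` is an illustrative helper, not code from the paper) builds a binary Huffman code for a small source and compares its average codeword length to the entropy. For this dyadic distribution the two coincide exactly; in general the entropy is a lower bound that no uniquely decodable code can beat on average.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given distribution."""
    # Heap entries: (probability, tiebreak id, symbols under this subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least likely subtrees
        p2, i2, s2 = heapq.heappop(heap)
        for sym in s1 + s2:               # merging deepens every leaf by one bit
            lengths[sym] += 1
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
entropy = -sum(p * math.log2(p) for p in probs)
avg_len = sum(p * n for p, n in zip(probs, huffman_lengths(probs)))
print(entropy, avg_len)  # both 1.75 bits per symbol
```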

👉 A Mathematical Theory of Communication in the context of Encoding/decoding model of communication

The encoding/decoding model of communication emerged in rough and general form in 1948 in Claude E. Shannon's "A Mathematical Theory of Communication", where it was part of a technical schema for designating the technological encoding of signals. In the 1950s it was gradually adapted by communications scholars, most notably Wilbur Schramm, primarily to explain how mass communications could be effectively transmitted to a public with their meanings intact for the audience (i.e., the decoders). The jargon of Shannon's information theory then moved into semiotics, notably through the work of Roman Jakobson, Roland Barthes, and Umberto Eco, who over the course of the 1960s put increasing emphasis on the social and political aspects of encoding. The model became much more widely known, and was popularized, when the cultural studies scholar Stuart Hall adapted it in 1973 for a conference addressing mass communications scholars. In a Marxist twist on the model, Hall's study, titled 'Encoding and Decoding in the Television Discourse', offered a theoretical approach to how media messages are produced, disseminated, and interpreted. Hall proposed that audience members can play an active role in decoding messages, since they rely on their own social contexts and are capable of changing messages through collective action.

Thus, encoding/decoding is the translation needed for a message to be easily understood. When you decode a message, you extract its meaning in order to simplify it. Decoding applies to both verbal and non-verbal forms of communication: decoding without words means interpreting displays of non-verbal communication. Examples include observing body language and its associated emotions, such as noticing signs that someone is upset, angry, or stressed, conveyed through excessive hand and arm movements, crying, or even silence. Moreover, when an individual sends a message, it can be interpreted differently from person to person. Decoding is about understanding others on the basis of the information carried by the message received. Whether the audience is large or the message is exchanged with a single person, decoding is the process of obtaining, absorbing, and sometimes using the information conveyed in a verbal or non-verbal message.
