IEC 80000-13 in the context of "Gigabyte"

The gigabyte (/ˈɡɪɡəbaɪt, ˈdʒɪɡəbaɪt/) is a multiple of the unit byte for digital information. The prefix giga means 10⁹ in the International System of Units (SI). Therefore, one gigabyte is one billion bytes. The unit symbol for the gigabyte is GB.

This definition is used in all contexts of science (especially data science), engineering, business, and many areas of computing, including storage capacities of hard drives, solid-state drives, and tapes, as well as data transmission speeds. However, the term is also used in some fields of computer science and information technology to denote 1073741824 (1024³ or 2³⁰) bytes, particularly for sizes of RAM. Thus, some usage of gigabyte has been ambiguous. To resolve this difficulty, IEC 80000-13 clarifies that a gigabyte (GB) is 10⁹ bytes and specifies the term gibibyte (GiB) to denote 2³⁰ bytes. These differences are still readily seen, for example, when a 400 GB drive's capacity is displayed by Microsoft Windows as 372 GB instead of 372 GiB. Analogously, a memory module that is labeled as having the size "1GB" has one gibibyte (1 GiB) of storage capacity.
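
As a rough illustration (not part of the standard itself), the decimal-versus-binary arithmetic behind the 400 GB / 372 GiB example can be sketched in Python. The constants and function names below are illustrative only:

```python
GB = 10**9    # gigabyte per IEC 80000-13 / SI: 1 GB = 1,000,000,000 bytes
GiB = 2**30   # gibibyte per IEC 80000-13: 1 GiB = 1,073,741,824 bytes

def bytes_to_gb(n_bytes: int) -> float:
    """Size expressed in decimal gigabytes."""
    return n_bytes / GB

def bytes_to_gib(n_bytes: int) -> float:
    """Size expressed in binary gibibytes."""
    return n_bytes / GiB

# A drive marketed as 400 GB holds 400 * 10**9 bytes.
drive_bytes = 400 * GB
print(f"{bytes_to_gb(drive_bytes):.1f} GB")    # 400.0 (decimal gigabytes)
print(f"{bytes_to_gib(drive_bytes):.1f} GiB")  # ~372.5, the figure Windows labels "372 GB"

# A memory module labeled "1GB" actually holds 2**30 bytes, i.e. 1 GiB.
module_bytes = 1 * GiB
print(f"{bytes_to_gb(module_bytes):.3f} GB")   # ~1.074 decimal gigabytes
print(f"{bytes_to_gib(module_bytes):.0f} GiB") # 1
```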

IEC 80000-13 in the context of Shannon (unit)

The shannon (symbol: Sh) is a unit of information named after Claude Shannon, the founder of information theory. IEC 80000-13 defines the shannon as the information content associated with an event when the probability of the event occurring is 1/2. It is understood as such within information theory, and is conceptually distinct from the bit, a term used in data processing and storage to denote a single instance of a binary signal. A sequence of n binary symbols (such as those contained in computer memory or a binary data transmission) is properly described as consisting of n bits, but the information content of those n symbols may be more or less than n shannons, depending on the a priori probability of the actual sequence of symbols.
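
A minimal sketch of this definition: the information content of an event with probability p is −log₂(p) shannons, so an event with probability 1/2 carries exactly 1 Sh. The specific probabilities below are chosen only for illustration:

```python
import math

def information_content_sh(p: float) -> float:
    """Information content, in shannons, of an event with probability p."""
    return -math.log2(p)

# The defining case: an event with probability 1/2 carries exactly 1 Sh.
print(information_content_sh(0.5))      # 1.0

# Eight binary symbols always occupy 8 bits of storage, but their information
# content depends on the a priori probability of the particular sequence:
print(information_content_sh(1 / 256))  # 8.0 Sh, if all 256 sequences are equiprobable
print(information_content_sh(0.25))     # 2.0 Sh, for a sequence that is a priori very likely
```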

The shannon also serves as a unit of the information entropy of an event, which is defined as the expected value of the information content of the event (i.e., the probability-weighted average of the information content of all potential events). Unlike information content, the entropy has an upper bound for a given number of possible outcomes, which is reached when those outcomes are equiprobable; the maximum entropy of n bits is n Sh. A further quantity for which the shannon is used is channel capacity, generally the maximum of the expected value of the information content that can be transferred over a channel with negligible probability of error, typically in the form of an information rate.
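
A short sketch of the entropy calculation, again with illustrative numbers only: entropy is the probability-weighted average of information content, and a uniform distribution over the 2ⁿ outcomes of n binary symbols attains the upper bound of n Sh.

```python
import math

def entropy_sh(probabilities: list[float]) -> float:
    """Entropy in shannons: the expected information content over a distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Three binary symbols (n = 3) give 2**3 = 8 possible outcomes.
uniform = [1 / 8] * 8
print(entropy_sh(uniform))   # 3.0 Sh: the upper bound n Sh, reached when outcomes are equiprobable

# Any non-uniform distribution over the same outcomes has lower entropy.
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03]
print(entropy_sh(skewed))    # ~2.2 Sh, below the 3 Sh maximum
```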
