Word (computer architecture) in the context of "Data (computer science)"

⭐ Core Definition: Word (computer architecture)

In computing, a word is the natural unit of data used by a particular processor design. A word is a fixed-sized datum handled as a unit by the instruction set or the hardware of the processor. The number of bits or digits in a word (the word size, word width, or word length) is an important characteristic of any specific processor design or computer architecture.

The size of a word is reflected in many aspects of a computer's structure and operation; the majority of the registers in a processor are usually word-sized and the largest datum that can be transferred to and from the working memory in a single operation is a word in many (not all) architectures. The largest possible address size, used to designate a location in memory, is typically a hardware word (here, "hardware word" means the full-sized natural word of the processor, as opposed to any other definition used).
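
As a rough illustration, the width of pointer and size types in C usually tracks the processor's word size on mainstream platforms; the following is a minimal sketch, assuming a typical hosted 32- or 64-bit C environment (some ABIs deviate from this pattern).

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* On most modern platforms these widths match the hardware word,
           e.g. 8 bytes on a 64-bit architecture. This is a heuristic, not
           a guarantee. */
        printf("void*     : %zu bytes\n", sizeof(void *));
        printf("size_t    : %zu bytes\n", sizeof(size_t));
        printf("uintptr_t : %zu bytes\n", sizeof(uintptr_t));
        return 0;
    }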

Word (computer architecture) in the context of String (computer science)

In computer programming, a string is traditionally a sequence of characters, either as a literal constant or as some kind of variable. The latter may allow its elements to be mutated and its length changed, or it may be fixed (after creation). A string is often implemented as an array data structure of bytes (or words) that stores a sequence of elements, typically characters, using some character encoding. More generally, the term string may also denote a sequence (or list) of data other than just characters.
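
For example, a C string literal is stored as an array of byte-sized elements plus a terminating null byte, each element holding one code unit of the character encoding; a small sketch, assuming an ASCII/UTF-8 execution character set:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char text[] = "word";                     /* 5 bytes: 'w','o','r','d','\0' */
        printf("characters: %zu\n", strlen(text));      /* 4 */
        printf("storage:    %zu bytes\n", sizeof text); /* 5, including the terminator */
        for (size_t i = 0; i < strlen(text); i++)       /* each element is one byte-sized code unit */
            printf("text[%zu] = 0x%02x\n", i, (unsigned char)text[i]);
        return 0;
    }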

Depending on the programming language and precise data type used, a variable declared to be a string may either cause storage in memory to be statically allocated for a predetermined maximum length or employ dynamic allocation to allow it to hold a variable number of elements.
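
A minimal sketch of the two storage strategies in C, with a fixed-capacity buffer on one hand and a heap allocation sized at run time on the other (buffer sizes and contents are illustrative):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* Static strategy: capacity fixed at 16 bytes, including the terminator. */
        char fixed[16];
        strncpy(fixed, "hello", sizeof fixed - 1);
        fixed[sizeof fixed - 1] = '\0';

        /* Dynamic strategy: capacity chosen at run time; can later be resized with realloc. */
        const char *src = "a string of arbitrary length";
        char *dynamic = malloc(strlen(src) + 1);
        if (dynamic == NULL)
            return 1;
        strcpy(dynamic, src);

        printf("%s / %s\n", fixed, dynamic);
        free(dynamic);
        return 0;
    }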

Word (computer architecture) in the context of Digital signal (signal processing)

In the context of digital signal processing (DSP), a digital signal is a discrete time, quantized amplitude signal. In other words, it is a sampled signal consisting of samples that take on values from a discrete set (a countable set that can be mapped one-to-one to a subset of integers). If that discrete set is finite, the discrete values can be represented with digital words of a finite width. Most commonly, these discrete values are represented as fixed-point words (either proportional to the waveform values or companded) or floating-point words.
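
As an illustration, the same amplitude can be held either in a 16-bit fixed-point word proportional to the waveform value or in a 32-bit floating-point word; a sketch in C, assuming a full-scale range of roughly ±1.0:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        double amplitude = 0.25;                        /* value within [-1.0, +1.0]  */
        int16_t fixed = (int16_t)(amplitude * 32767.0); /* 16-bit fixed-point word    */
        float fp = (float)amplitude;                    /* 32-bit floating-point word */
        printf("fixed-point word:    %d (0x%04x)\n", fixed, (uint16_t)fixed);
        printf("floating-point word: %f\n", fp);
        return 0;
    }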

The process of analog-to-digital conversion produces a digital signal. The conversion process can be thought of as occurring in two steps: sampling, which converts the continuous-time signal into a discrete-time sequence of values, and quantization, which maps each sampled value to the nearest value in a finite set so that it can be stored in a digital word of finite width.
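
A toy sketch of those two steps in C; the sine input, the sample rate, and the 16-bit word width are arbitrary assumptions:

    #include <stdio.h>
    #include <stdint.h>
    #include <math.h>

    int main(void) {
        const double pi = 3.14159265358979323846;
        const double fs = 8000.0;   /* sample rate in Hz (assumed) */
        const double f  = 440.0;    /* input tone in Hz (assumed)  */
        for (int n = 0; n < 8; n++) {
            /* Step 1: sampling - evaluate the signal at the discrete times n/fs. */
            double x = sin(2.0 * pi * f * n / fs);
            /* Step 2: quantization - round to the nearest 16-bit fixed-point word. */
            int16_t word = (int16_t)lrint(x * 32767.0);
            printf("n=%d  x=%+.5f  word=%+6d\n", n, x, word);
        }
        return 0;
    }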

Word (computer architecture) in the context of Byte

The byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer, and for this reason it is the smallest addressable unit of memory in many computer architectures. To disambiguate arbitrarily sized bytes from the common 8-bit definition, network protocol documents such as the Internet Protocol (RFC 791) refer to an 8-bit byte as an octet. The bits in an octet are usually numbered from 0 to 7 or from 7 to 0, depending on the bit endianness.
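
A small check of this in C: CHAR_BIT reports the platform's byte width, while uint8_t (where the type exists) is an exact 8-bit octet by definition; the sketch assumes a typical platform where both are 8 bits.

    #include <stdio.h>
    #include <limits.h>
    #include <stdint.h>

    int main(void) {
        printf("bits per byte on this platform: %d\n", CHAR_BIT); /* 8 on virtually all modern machines */
        uint8_t octet = 0xA5;                                     /* exactly 8 bits by definition */
        printf("octet value: 0x%02X\n", (unsigned)octet);
        return 0;
    }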

The size of the byte has historically been hardware-dependent, and no definitive standards existed that mandated the size. Sizes from 1 to 48 bits have been used. Six-bit character codes were an often-used encoding in early systems, and computers using six-bit and nine-bit bytes were common in the 1960s. These systems often had memory words of 12, 18, 24, 30, 36, 48, or 60 bits, corresponding to 2, 3, 4, 5, 6, 8, or 10 six-bit bytes, and persisted, in legacy systems, into the twenty-first century. In this era, bit groupings in the instruction stream were often referred to as syllables or slab, before the term byte became common.

Word (computer architecture) in the context of Bit numbering

In computing, bit numbering is the convention used to identify the bit positions in a binary number. The bits can be those in a memory byte or word, or those of an internal CPU register or data bus.
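
A brief sketch of the two common conventions for an 8-bit value, where LSB-0 numbering treats bit 0 as the least significant bit and MSB-0 numbering treats bit 0 as the most significant bit:

    #include <stdio.h>
    #include <stdint.h>

    /* Extract bit n of an 8-bit value under each numbering convention. */
    static unsigned bit_lsb0(uint8_t v, unsigned n) { return (v >> n) & 1u; }       /* bit 0 = least significant */
    static unsigned bit_msb0(uint8_t v, unsigned n) { return (v >> (7 - n)) & 1u; } /* bit 0 = most significant  */

    int main(void) {
        uint8_t v = 0xA5; /* binary 1010 0101 */
        for (unsigned n = 0; n < 8; n++)
            printf("bit %u: LSB-0 -> %u, MSB-0 -> %u\n", n, bit_lsb0(v, n), bit_msb0(v, n));
        return 0;
    }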
