Instruction set architecture in the context of IBM System/360 architecture


Core Definition: Instruction set architecture

An instruction set architecture (ISA) is an abstract model that defines the programmable interface of a computer's CPU, that is, how software can control the computer. A device that executes instructions described by an ISA, such as a CPU, is an implementation of that ISA. Generally, the same ISA is used for a family of related CPU devices.

In general, an ISA defines the instructions, data types, registers, and the mechanisms for managing main memory, such as addressing modes, virtual memory, and memory consistency. The ISA also defines the input/output model of the programmable interface.
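As a loose illustration of the distinction between an ISA and its implementations, the sketch below defines a tiny hypothetical instruction set (the opcodes, register count, and instruction format are invented for this example, not taken from any real architecture) and interprets it in C. Any program or device that executes these instructions with the same observable behavior would be another implementation of the same ISA.

```c
#include <stdint.h>
#include <stdio.h>

/* A hypothetical four-register ISA, invented for illustration only. */
enum opcode { OP_LOADI, OP_ADD, OP_SUB, OP_HALT };

struct insn {
    enum opcode op;
    uint8_t     rd;   /* destination register index */
    int32_t     src;  /* immediate value (LOADI) or source register index */
};

/* One possible implementation of the hypothetical ISA: a software interpreter. */
static void run(const struct insn *prog)
{
    int32_t reg[4] = {0};

    for (size_t pc = 0; ; pc++) {
        const struct insn *i = &prog[pc];
        switch (i->op) {
        case OP_LOADI: reg[i->rd]  = i->src;       break;
        case OP_ADD:   reg[i->rd] += reg[i->src];  break;
        case OP_SUB:   reg[i->rd] -= reg[i->src];  break;
        case OP_HALT:
            printf("r0=%d r1=%d\n", reg[0], reg[1]);
            return;
        }
    }
}

int main(void)
{
    /* r0 = 5; r1 = 7; r0 = r0 + r1; halt */
    const struct insn prog[] = {
        { OP_LOADI, 0, 5 },
        { OP_LOADI, 1, 7 },
        { OP_ADD,   0, 1 },
        { OP_HALT,  0, 0 },
    };
    run(prog);
    return 0;
}
```

A hardware CPU and a software interpreter such as this one can both implement a single ISA; they differ only in how the instructions are carried out.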



Instruction set architecture in the context of IBM System/360 architecture

The IBM System/360 architecture is the model-independent architecture for the entire S/360 line of mainframe computers; it encompasses, but is not limited to, the instruction set architecture. The elements of the architecture are documented in the IBM System/360 Principles of Operation and the IBM System/360 I/O Interface Channel to Control Unit Original Equipment Manufacturers' Information manuals.

In this Dossier

Instruction set architecture in the context of Computer architecture

In computer science and computer engineering, a computer architecture is the structure of a computer system made from component parts. It can sometimes be a high-level description that ignores details of the implementation. At a more detailed level, the description may include the instruction set architecture design, microarchitecture design, logic design, and implementation.


Instruction set architecture in the context of macOS

macOS (previously OS X and originally Mac OS X) is a proprietary Unix-based operating system, derived from OPENSTEP for Mach and FreeBSD, which has been marketed and developed by Apple since 2001. It is the current operating system for Apple's Mac computers. Within the market of desktop and laptop computers, it is the second most widely used desktop OS, after Microsoft Windows and ahead of all Linux distributions, including ChromeOS and SteamOS. As of 2025, the most recent release of macOS is macOS 26 Tahoe, the 22nd major version of macOS.

Mac OS X succeeded the classic Mac OS, the primary Macintosh operating system from 1984 to 2001. Its underlying architecture came from NeXT's NeXTSTEP, as a result of Apple's acquisition of NeXT, which also brought Steve Jobs back to Apple. The first desktop version, Mac OS X 10.0, was released on March 24, 2001. Mac OS X Leopard and all later versions of macOS, other than OS X Lion, are UNIX 03 certified. Each of Apple's other contemporary operating systems, including iOS, iPadOS, watchOS, tvOS, audioOS and visionOS, is a derivative of macOS. Throughout its history, macOS has supported three major processor architectures: the initial version supported PowerPC-based Macs only, with support for Intel-based Macs beginning with Mac OS X Tiger 10.4.4 and support for ARM-based Apple silicon Macs beginning with macOS Big Sur. Support for PowerPC-based Macs was dropped with Mac OS X Snow Leopard, and it was announced at the 2025 Worldwide Developers Conference that macOS Tahoe will be the last version to support Intel-based Macs.
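One way these ISA transitions surface to C programmers on macOS is through compiler-predefined macros that identify the target architecture. The sketch below tests macros that Apple toolchains are known to predefine (__ppc__, __x86_64__, __arm64__); treat it as an illustrative check rather than an exhaustive or authoritative list, since the exact set of macros depends on the compiler and its version.

```c
#include <stdio.h>

/* Report which instruction set architecture the binary was compiled for.
 * The macro names are the ones commonly predefined by Apple's toolchains;
 * the exact set varies by compiler, so this is an illustrative sketch. */
int main(void)
{
#if defined(__arm64__) || defined(__aarch64__)
    puts("Compiled for ARM64 (Apple silicon)");
#elif defined(__x86_64__)
    puts("Compiled for x86-64 (Intel)");
#elif defined(__ppc__) || defined(__ppc64__)
    puts("Compiled for PowerPC");
#else
    puts("Compiled for some other instruction set architecture");
#endif
    return 0;
}
```

Compiling the same source with `clang -arch x86_64` and with `clang -arch arm64` yields two different machine-code translations of it; a macOS universal binary packages both so a single file can run on either ISA.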


Instruction set architecture in the context of C (programming language)

C is a general-purpose programming language. It was created in the 1970s by Dennis Ritchie and remains widely used and influential. By design, C gives the programmer relatively direct access to the features of the typical CPU architecture, customized for the target instruction set. It has been and continues to be used to implement operating systems (especially kernels), device drivers, and protocol stacks, but its use in application software has been decreasing. C is used on computers that range from the largest supercomputers to the smallest microcontrollers and embedded systems.

A successor to the programming language B, C was originally developed at Bell Labs by Ritchie between 1972 and 1973 to construct utilities running on Unix. It was applied to re-implementing the kernel of the Unix operating system. During the 1980s, C gradually gained popularity. It has become one of the most widely used programming languages, with C compilers available for practically all modern computer architectures and operating systems. The book The C Programming Language, co-authored by the original language designer, served for many years as the de facto standard for the language. C has been standardized since 1989 by the American National Standards Institute (ANSI) and, subsequently, jointly by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).
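To make the claim about "relatively direct access" to the CPU concrete, the short function below uses only operations that map naturally onto common ISA primitives. The comments describe a typical correspondence for a load/store register machine; the actual instructions a compiler emits depend on the target ISA and optimization level, so the mapping is indicative rather than exact.

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Sum an array of 32-bit integers. Each construct corresponds closely to
 * a common machine-level operation; the exact instructions depend on the
 * target ISA and optimization settings. */
static uint32_t sum(const uint32_t *a, size_t n)
{
    uint32_t total = 0;               /* clear a register                  */
    for (size_t i = 0; i < n; i++) {  /* compare and conditional branch    */
        total += a[i];                /* load from memory, then add        */
    }
    return total;                     /* leave result in a return register */
}

int main(void)
{
    const uint32_t data[] = { 1, 2, 3, 4 };
    printf("%u\n", (unsigned)sum(data, 4));   /* prints 10 */
    return 0;
}
```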


Instruction set architecture in the context of Microarchitecture

In electronics, computer science and computer engineering, microarchitecture, also called computer organization and sometimes abbreviated as μarch or uarch, is the way a given instruction set architecture (ISA) is implemented in a particular processor. A given ISA may be implemented with different microarchitectures; implementations may vary due to different goals of a given design or due to shifts in technology.

Computer architecture is the combination of microarchitecture and instruction set architecture.
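A rough software analogy, offered only as an aid to intuition and not as a description of hardware design, is two functions that honor the same externally visible contract while working differently inside. Below, the contract (count the set bits of a 64-bit word) plays the role of the ISA, and the two bodies play the role of different microarchitectures; the function names and the comparison itself are inventions for this sketch.

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* The shared "contract": count the bits set in a 64-bit word. Callers rely
 * only on this behavior, much as software relies on an ISA rather than on
 * the microarchitecture beneath it. */

/* "Implementation A": inspect one bit per iteration. */
static unsigned popcount_serial(uint64_t x)
{
    unsigned n = 0;
    while (x) {
        n += (unsigned)(x & 1u);
        x >>= 1;
    }
    return n;
}

/* "Implementation B": same observable result, different internal strategy
 * (clear the lowest set bit on each iteration). */
static unsigned popcount_kernighan(uint64_t x)
{
    unsigned n = 0;
    while (x) {
        x &= x - 1;
        n++;
    }
    return n;
}

int main(void)
{
    uint64_t x = 0xF0F0F0F0F0F0F0F0ull;
    assert(popcount_serial(x) == popcount_kernighan(x));
    printf("both implementations count %u set bits\n", popcount_serial(x));
    return 0;
}
```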


Instruction set architecture in the context of Assembly language

In computing, assembly language (alternatively assembler language or symbolic machine code), often referred to simply as assembly and commonly abbreviated as ASM or asm, is any low-level programming language with a very strong correspondence between the instructions in the language and the architecture's machine code instructions. Assembly language usually has one statement per machine code instruction (1:1), but constants, comments, assembler directives, symbolic labels for memory locations and registers, and macros are generally also supported.

The first assembly code in which a language is used to represent machine code instructions is found in Kathleen and Andrew Donald Booth's 1947 work, Coding for A.R.C. Assembly code is converted into executable machine code by a utility program referred to as an assembler. The term "assembler" is generally attributed to Wilkes, Wheeler and Gill in their 1951 book The Preparation of Programs for an Electronic Digital Computer, who, however, used the term to mean "a program that assembles another program consisting of several sections into a single program". The conversion process is referred to as assembly, as in assembling the source code. The computational step when an assembler is processing a program is called assembly time.
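The heart of that conversion is a table that maps symbolic mnemonics to numeric machine code. The sketch below uses invented mnemonics and opcode values for a made-up instruction set, so it should be read as an outline of the idea only; a real assembler also resolves labels, evaluates directives, expands macros, and follows the encoding rules of a specific ISA.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical mnemonics and one-byte opcodes for a made-up ISA; the 1:1
 * mapping from mnemonic to opcode mirrors how assembly statements
 * correspond to machine instructions. */
struct mnemonic { const char *name; uint8_t opcode; };

static const struct mnemonic table[] = {
    { "LOAD",  0x10 },
    { "ADD",   0x20 },
    { "STORE", 0x30 },
    { "HALT",  0xFF },
};

/* Translate one mnemonic into its opcode; returns 1 on success, 0 if the
 * mnemonic is unknown. */
static int assemble(const char *name, uint8_t *out)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].name, name) == 0) {
            *out = table[i].opcode;
            return 1;
        }
    }
    return 0;
}

int main(void)
{
    const char *source[] = { "LOAD", "ADD", "STORE", "HALT" };

    for (size_t i = 0; i < sizeof source / sizeof source[0]; i++) {
        uint8_t byte;
        if (assemble(source[i], &byte))
            printf("%-5s -> 0x%02X\n", source[i], byte);
    }
    return 0;
}
```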


Instruction set architecture in the context of Low-level programming language

A low-level programming language is a programming language that provides little or no abstraction from a computer's instruction set architecture, memory or underlying physical hardware; commands or functions in the language are structurally similar to a processor's instructions. These languages provide the programmer with full control over program memory and the underlying machine code instructions. Because of the low level of abstraction (hence the term "low-level") between the language and machine language, low-level languages are sometimes described as being "close to the hardware".
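The "full control over program memory" that low-level languages offer shows up in C as the ability to address storage byte by byte through raw pointers. The sketch below rewrites the bytes of a 32-bit word individually; on real hardware the same pattern is commonly used with a pointer to a fixed, device-specific address (a memory-mapped register), which is left out here because any concrete address would be platform-specific.

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t word = 0;

    /* Treat the 32-bit word as raw bytes and write each one through a
     * pointer, the way low-level code pokes individual memory locations.
     * How the bytes combine into the final value depends on the target
     * ISA's endianness. */
    uint8_t *bytes = (uint8_t *)&word;
    for (size_t i = 0; i < sizeof word; i++) {
        bytes[i] = (uint8_t)(0x10u * (i + 1));   /* 0x10, 0x20, 0x30, 0x40 */
    }

    printf("word = 0x%08X\n", (unsigned)word);   /* e.g. 0x40302010 on a little-endian ISA */
    return 0;
}
```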
