Interoperability in the context of "Standardization"

Interoperability in the context of Standardization

Standardization (American English) or standardisation (British English) is the process of implementing and developing technical standards based on the consensus of different parties that include firms, users, interest groups, standards organizations and governments. Standardization can help maximize compatibility, interoperability, safety, repeatability, efficiency, and quality. It can also facilitate a normalization of formerly custom processes.

In social sciences, including economics, the idea of standardization is close to the solution for a coordination problem, a situation in which all parties can realize mutual gains, but only by making mutually consistent decisions. Divergent national standards impose costs on consumers and can be a form of non-tariff trade barrier.
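The coordination problem can be made concrete with a toy payoff matrix. In the hypothetical two-firm game below (all payoff values are invented for illustration), each firm picks a standard and both profit only when their choices agree, so every matching profile is a pure-strategy equilibrium:

```python
from itertools import product

# Hypothetical payoffs for two firms each choosing a standard ("A" or "B").
# Both gain only when their choices match -- a pure coordination game.
PAYOFFS = {
    ("A", "A"): (2, 2),
    ("A", "B"): (0, 0),
    ("B", "A"): (0, 0),
    ("B", "B"): (1, 1),
}

def pure_nash_equilibria(payoffs):
    """Return profiles from which neither player gains by deviating alone."""
    strategies = {"A", "B"}
    equilibria = []
    for s1, s2 in product(sorted(strategies), repeat=2):
        u1, u2 = payoffs[(s1, s2)]
        best1 = all(u1 >= payoffs[(alt, s2)][0] for alt in strategies)
        best2 = all(u2 >= payoffs[(s1, alt)][1] for alt in strategies)
        if best1 and best2:
            equilibria.append((s1, s2))
    return equilibria

print(pure_nash_equilibria(PAYOFFS))  # → [('A', 'A'), ('B', 'B')]
```

Both firms agreeing on either standard is stable, which is why consensus processes (rather than market competition alone) are often needed to select one.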

Interoperability in the context of Digital library

A digital library (also called an online library, an internet library, a digital repository, a library without walls, or a digital collection) is an online database of digital resources that can include text, still images, audio, video, digital documents, and other digital media formats, or, more broadly, any library accessible through the internet. Objects can consist of digitized content such as print or photographs, as well as born-digital content such as word-processor files or social media posts. In addition to storing content, digital libraries provide means for organizing, searching, and retrieving it. Digital libraries vary immensely in size and scope and can be maintained by individuals or organizations. The digital content may be stored locally or accessed remotely via computer networks. These information retrieval systems can exchange information with each other through interoperability and sustainability.
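As a minimal, hypothetical sketch of how such exchange can work, the example below maps a repository's internal fields onto a shared Dublin Core-style schema (the element names `title`, `creator`, and `date` come from Dublin Core; the internal field names and the record are invented):

```python
# Shared schema both repositories agree on (Dublin Core-style element names).
COMMON_FIELDS = ("title", "creator", "date")

def export_record(internal_record, field_map):
    """Translate a repository-specific record into the shared schema."""
    return {common: internal_record[local]
            for common, local in field_map.items()
            if local in internal_record}

# Hypothetical internal schema of one digital library.
library_a_record = {"hdr": "On Interoperability", "auth": "J. Doe", "yr": "2001"}
library_a_map = {"title": "hdr", "creator": "auth", "date": "yr"}

shared = export_record(library_a_record, library_a_map)
print(shared)  # a record any schema-aware peer can ingest
```

The point is that each system keeps its own internals and only the exchanged representation is standardized, which is how real harvesting protocols for digital libraries operate.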

Interoperability in the context of Interoperation

In engineering, interoperation is the setup of ad hoc components and methods to make two or more systems work together as a combined system with some partial functionality during a certain time, possibly requiring human supervision to perform necessary adjustments and corrections.

This contrasts with interoperability, which in principle permits any number of systems compliant with a given standard to work together smoothly and unattended over a long period, as a combined system with the full functionality defined by the standard.
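The distinction can be sketched in code. In this illustrative example (all class and method names invented), a compliant sensor implements a shared standard interface directly, while a legacy device participates only through an ad hoc adapter shim, the hallmark of interoperation:

```python
class StandardSensor:
    """Hypothetical shared standard: every compliant sensor exposes read_celsius()."""
    def read_celsius(self):
        raise NotImplementedError

class CompliantSensor(StandardSensor):
    """Interoperability: built to the standard, works unattended with any consumer."""
    def read_celsius(self):
        return 21.5

class LegacySensor:
    """Pre-standard device with its own interface and units."""
    def get_temp_f(self):
        return 70.7

class LegacyAdapter(StandardSensor):
    """Interoperation: an ad hoc shim bolted on so the legacy device can
    participate, possibly only partially and for a limited time."""
    def __init__(self, legacy):
        self.legacy = legacy
    def read_celsius(self):
        return (self.legacy.get_temp_f() - 32) * 5 / 9

def average_celsius(sensors):
    """A consumer written once against the standard interface."""
    return sum(s.read_celsius() for s in sensors) / len(sensors)

print(round(average_celsius([CompliantSensor(), LegacyAdapter(LegacySensor())]), 2))
```

The consumer never learns which devices are native and which are adapted; the adapter, however, must be maintained by hand, which is exactly the supervision cost the passage above attributes to interoperation.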

Interoperability in the context of Trans-European Networks

The Trans-European Networks (TEN) were created by the European Union by Articles 154–156 of the Treaty of Rome (1957), with the stated goals of the creation of an internal market and the reinforcement of economic and social cohesion. To various supporters of this policy, it made little sense to talk of a big EU market, with freedom of movement within it for goods, persons and services, unless the various regions and national networks making up that market were properly linked by modern and efficient infrastructure. The construction of Trans-European Networks was also seen as an important element for economic growth and the creation of employment.

The Treaty Establishing the European Community first provided a legal basis for the TENs. Under the terms of Chapter XV of the Treaty (Articles 154, 155 and 156), the European Union must aim to promote the development of Trans-European Networks as a key element for the creation of the Internal Market and the reinforcement of Economic and Social Cohesion. This development includes the interconnection and interoperability of national networks as well as access to such networks.

Interoperability in the context of DLNA

Digital Living Network Alliance (DLNA) is a set of interoperability standards for sharing home digital media among multimedia devices. It allows users to share or stream stored media files to various certified devices on the same network, such as PCs, smartphones, TV sets, game consoles, stereo systems, and NAS devices. DLNA incorporates several existing public standards, including Universal Plug and Play (UPnP) for media management, device discovery, and control; wired and wireless networking standards; and widely used digital media formats. Many routers and network-attached storage (NAS) devices have built-in DLNA support, as do software applications such as Windows Media Player.
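As an illustration of the UPnP discovery step that DLNA builds on, the sketch below constructs an SSDP M-SEARCH request and parses a response. In a real network the request is sent over multicast UDP to 239.255.255.250:1900; here the sample reply is invented and no traffic is sent:

```python
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900

def build_msearch(search_target="urn:schemas-upnp-org:device:MediaServer:1",
                  mx=2):
    """Build an SSDP M-SEARCH request for the given search target (ST)."""
    return ("M-SEARCH * HTTP/1.1\r\n"
            f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
            'MAN: "ssdp:discover"\r\n'
            f"MX: {mx}\r\n"
            f"ST: {search_target}\r\n\r\n")

def parse_ssdp_headers(response):
    """Parse the header lines of an SSDP (HTTP-style) response into a dict."""
    headers = {}
    for line in response.split("\r\n")[1:]:
        if ":" in line:
            name, _, value = line.partition(":")
            headers[name.strip().upper()] = value.strip()
    return headers

# A representative (made-up) reply from a media server on the LAN; the
# LOCATION header points at the device's XML description document.
sample = ("HTTP/1.1 200 OK\r\n"
          "LOCATION: http://192.168.1.20:8200/rootDesc.xml\r\n"
          "ST: urn:schemas-upnp-org:device:MediaServer:1\r\n\r\n")
print(parse_ssdp_headers(sample)["LOCATION"])
```

Discovery by search target is what lets a certified renderer find any certified server on the network without prior configuration, which is the interoperability DLNA certifies.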

DLNA was created by Sony and Intel, and the consortium soon grew to include various PC and consumer-electronics companies, publishing its first set of guidelines in June 2004. The alliance developed and promoted DLNA as a certification standard, claiming a membership of "more than 200 companies" before dissolving in 2017. By September 2014, over 25,000 device models had obtained "DLNA Certified" status, indicated by a logo on their packaging confirming their interoperability with other certified devices.

Interoperability in the context of Computer standard

Computer hardware and software standards are technical standards instituted for compatibility and interoperability between software, systems, platforms and devices.

Interoperability in the context of Ontology (information science)

In information science, an ontology encompasses a representation, formal naming, and definitions of the categories, properties, and relations between the concepts, data, or entities that pertain to one, many, or all domains of discourse. More simply, an ontology is a way of showing the properties of a subject area and how they are related, by defining a set of terms and relational expressions that represent the entities in that subject area. The field which studies ontologies so conceived is sometimes referred to as applied ontology.

Every academic discipline or field, in creating its terminology, thereby lays the groundwork for an ontology. Each uses ontological assumptions to frame explicit theories, research and applications. Improved ontologies may improve problem solving within that domain, interoperability of data systems, and discoverability of data. Translating research papers across languages becomes easier when experts from different countries maintain a controlled vocabulary that maps the field's jargon between their languages. For instance, the definition and ontology of economics is a primary concern in Marxist economics, but also in other subfields of economics. An example of economics relying on information science occurs where a simulation or model is intended to enable economic decisions, such as determining what capital assets are at risk and by how much (see risk management).
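One common way to make such an ontology machine-usable is to encode it as subject-predicate-object triples, in the spirit of RDF. The vocabulary below is invented for illustration, reusing the capital-assets example from the passage above:

```python
# A tiny ontology as subject-predicate-object triples (invented vocabulary).
TRIPLES = {
    ("CapitalAsset", "is_a", "Asset"),
    ("Bond", "is_a", "CapitalAsset"),
    ("Equity", "is_a", "CapitalAsset"),
    ("Bond", "has_property", "interest_rate_risk"),
}

def subclasses_of(cls, triples):
    """Transitively collect everything reachable from `cls` via is_a edges."""
    direct = {s for s, p, o in triples if p == "is_a" and o == cls}
    found = set(direct)
    for sub in direct:
        found |= subclasses_of(sub, triples)
    return found

print(sorted(subclasses_of("Asset", TRIPLES)))  # → ['Bond', 'CapitalAsset', 'Equity']
```

Because the categories and relations are explicit, two data systems that share this vocabulary can answer the same query ("which things are assets at risk?") over each other's data, which is the interoperability benefit the text describes.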

Interoperability in the context of Abstraction layer

In computing, an abstraction layer or abstraction level is a way of hiding the working details of a subsystem. Examples of software models that use layers of abstraction include the OSI model for network protocols and graphics libraries such as OpenGL, which allow the separation of concerns to facilitate interoperability and platform independence.

In computer science, an abstraction layer is a generalization of a conceptual model or algorithm, away from any specific implementation. These generalizations arise from broad similarities that are best encapsulated by models expressing the commonalities of various specific implementations. The simplification provided by a good abstraction layer allows for easy reuse: by distilling a useful concept or design pattern, it makes situations where that pattern accurately applies quick to recognize. Merely composing lower-level elements into a construct does not count as an abstraction layer unless it shields users from the underlying complexity.
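A minimal sketch of the idea (all names here are illustrative): callers program against one interface while backends with different internals remain interchangeable, so the caller is shielded from each backend's working details:

```python
from abc import ABC, abstractmethod

class KeyValueStore(ABC):
    """The abstraction layer: callers never see a backend's internals."""
    @abstractmethod
    def put(self, key, value):
        ...
    @abstractmethod
    def get(self, key):
        ...

class InMemoryStore(KeyValueStore):
    """One concrete backend: a plain dict."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class UpperCasingStore(KeyValueStore):
    """A second backend with different internals but the same contract."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key.upper()] = value
    def get(self, key):
        return self._data.get(key.upper())

def greet(store):
    store.put("name", "world")           # code written once...
    return f"hello {store.get('name')}"  # ...runs against any backend

print(greet(InMemoryStore()), greet(UpperCasingStore()))
```

Note that `KeyValueStore` earns the name "abstraction layer" only because callers like `greet` never touch `_data`; a class that merely bundled the dict operations without hiding them would not qualify, per the last sentence above.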

Interoperability in the context of Web 2.0

Web 2.0 (also known as participative (or participatory) web and social web) refers to websites that emphasize user-generated content, ease of use, participatory culture, and interoperability (i.e., compatibility with other products, systems, and devices) for end users.

The term was coined by Darcy DiNucci in 1999 and later popularized by Tim O'Reilly and Dale Dougherty at the first Web 2.0 Conference in 2004. Although the term mimics the numbering of software versions, it does not denote a formal change in the nature of the World Wide Web; the term merely describes a general change that occurred during this period as interactive websites proliferated and came to overshadow the older, more static websites of the original Web.
