Extended reality in the context of "Spatial computing"

⭐ Core Definition: Extended reality

Extended reality (XR) is an umbrella term meant to "encompass and interpolate between the other realities": augmented reality (AR), virtual reality (VR), mixed/mediated reality (MR), and physical reality. The technology is intended to combine or mirror the physical world with a "digital twin world" able to interact with it, giving users an immersive experience within a virtual or augmented environment.

The first usage of the term "extended reality" was in reference to the use of technology to extrapolate (extend) beyond typical human perception, e.g. allowing us to see sound waves, radio waves, and otherwise invisible phenomena.

👉 Extended reality in the context of Spatial computing

Spatial computing refers to 3D human–computer interaction techniques that users perceive as taking place in the real world, in and around their bodies and physical environments, rather than being constrained to, and perceptually behind, computer screens or purely virtual worlds. The concept inverts the long-standing practice of teaching people to interact with computers in digital environments, instead teaching computers to understand and interact with people more naturally in the human world. It overlaps with and encompasses related concepts including extended reality, augmented reality, mixed reality, natural user interfaces, contextual computing, affective computing, and ubiquitous computing. The terminology used to label and discuss these adjacent technologies is imprecise.

Spatial computing devices include sensors (RGB cameras, depth cameras, 3D trackers, inertial measurement units, and other tools) to sense and track nearby human bodies, including hands, arms, eyes, legs, and mouths, during ordinary interactions with people and computers in 3D space. They also use computer vision to interpret real-world scenes such as rooms, streets, or stores, to read labels, recognize objects, build 3D maps, and more. Quite often they also use extended and mixed reality to superimpose virtual 3D graphics and virtual 3D audio onto the human visual and auditory systems, providing information more naturally and contextually than traditional 2D screens can.
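To make that sensing loop concrete, the sketch below shows roughly how such a device's SDK can expose hand tracking and scene reconstruction. It assumes the visionOS flavour of ARKit (ARKitSession, HandTrackingProvider, SceneReconstructionProvider); the function name is hypothetical, and other platforms expose equivalent but differently named APIs, so treat this as an illustrative outline rather than a portable recipe.

```swift
import ARKit

// Illustrative sketch of a spatial computing sensing loop, assuming the
// visionOS ARKit API; the function name here is hypothetical.
func runSpatialSensing() async throws {
    let session = ARKitSession()
    let hands = HandTrackingProvider()          // tracks hand and finger joints
    let scene = SceneReconstructionProvider()   // builds a 3D mesh of the surroundings

    try await session.run([hands, scene])

    // Hand tracking: each update carries a skeleton of joint transforms
    // that an app can turn into pointing, pinching, or grabbing input.
    Task {
        for await update in hands.anchorUpdates {
            if let tip = update.anchor.handSkeleton?.joint(.indexFingerTip) {
                _ = tip.anchorFromJointTransform   // fingertip pose relative to the hand anchor
            }
        }
    }

    // Scene understanding: mesh anchors describe real-world surfaces that
    // virtual 3D content can be placed on, collided with, or occluded by.
    for await update in scene.anchorUpdates {
        _ = update.anchor.geometry
    }
}
```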

Extended reality in the context of visionOS

visionOS is an extended reality operating system derived primarily from iPadOS and its core frameworks (including UIKit, SwiftUI, ARKit, and RealityKit), supplemented by MR-specific frameworks for foveated rendering and real-time interaction. It was developed by Apple exclusively for its Apple Vision Pro mixed reality headset and was unveiled on June 5, 2023, at Apple's WWDC23 event alongside the Vision Pro itself. The software was released on February 2, 2024, shipping with the Apple Vision Pro.
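As a rough illustration of how those frameworks fit together, the sketch below shows what a minimal visionOS app entry point can look like: SwiftUI declares the scenes, and RealityKit supplies the 3D content placed into an immersive space. The app name and space identifier are made up for the example.

```swift
import SwiftUI
import RealityKit

// Minimal visionOS app sketch; "ExampleXRApp" and the space id are hypothetical.
@main
struct ExampleXRApp: App {
    var body: some Scene {
        // A conventional 2D window, carried over from the iPadOS-style app model.
        WindowGroup {
            Text("Hello, spatial computing")
        }

        // An immersive space that places RealityKit entities around the user.
        ImmersiveSpace(id: "Immersive") {
            RealityView { content in
                // A simple sphere floating half a metre in front of the user.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                sphere.position = [0, 1.5, -0.5]
                content.add(sphere)
            }
        }
    }
}
```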

Extended reality in the context of Meta Horizon OS

Meta Horizon OS, previously known informally as Meta Quest Platform or Meta Quest OS, is an Android-based extended reality operating system for the Meta Quest line of devices released by Meta Platforms. Initially developed for the embedded operating system on the Oculus Rift and Oculus Rift S, the platform has been based on the Android operating system since the release of the Oculus Go in 2018. It first supported augmented reality via grayscale camera passthrough upon the release of the Oculus Quest in 2019, and has supported color passthrough since the release of the Meta Quest Pro in 2022.

On April 22, 2024, the company announced that the platform would be rebranded as Meta Horizon OS.

Extended reality in the context of 5G

5G is the fifth generation of cellular network technology and the successor to 4G. First deployed in 2019, its technical standards are developed by the 3rd Generation Partnership Project (3GPP) in cooperation with the ITU’s IMT-2020 program. 5G networks divide coverage areas into smaller zones called cells, enabling devices to connect to local base stations via radio. Each station connects to the broader telephone network and the Internet through high-speed optical fiber or wireless backhaul.

Compared to 4G, 5G offers significantly faster data transfer speed—up to 10 Gbit/s in tests—and lower latency, with response times of just a few milliseconds. These advancements allow networks to support more users and applications such as extended reality, autonomous vehicles, remote surgery trials, and fixed wireless access for home Internet access. 5G also supports massive connectivity for sensors and machines, commonly referred to as the Internet of things (IoT), and leverages edge computing to improve data processing efficiency.
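To put those figures in perspective for XR, the back-of-the-envelope arithmetic below compares frame delivery times at the quoted 5G peak rate against a typical 4G rate. The frame size, the 4G figure, and the roughly 20 ms comfort budget are illustrative assumptions, not measurements.

```swift
// Illustrative arithmetic only; the 4G rate, frame size, and latency
// budget below are assumed values for comparison, not measured figures.
let peak5G = 10_000_000_000.0   // bits per second (~10 Gbit/s peak, as quoted above)
let typical4G = 100_000_000.0   // bits per second (~100 Mbit/s, assumed)
let xrFrame = 50_000_000.0      // bits in one compressed stereo XR frame (assumed ~6 MB)

let over5G = xrFrame / peak5G * 1_000    // ≈ 5 ms to deliver the frame
let over4G = xrFrame / typical4G * 1_000 // ≈ 500 ms to deliver the frame

// Adding a few milliseconds of 5G round-trip latency keeps the total within
// the roughly 20 ms motion-to-photon budget often cited for comfortable XR;
// the 4G-class figures above exceed it by an order of magnitude or more.
print("5G: \(over5G) ms, 4G: \(over4G) ms")
```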

Extended reality in the context of Godot (game engine)

Godot (/ˈɡɒdoʊ/ GOD-oh, /ɡəˈdoʊ/ gə-DOH, or /ˈɡoʊdɒt/ GOH-dot) is a cross-platform, free and open-source game engine released under the permissive MIT license. It was initially developed in Buenos Aires by Argentine software developers Juan Linietsky and Ariel Manzur for several companies in Latin America prior to its public release in 2014. The development environment runs on many platforms and can export to several more. It is designed to create both 2D and 3D games targeting PC, mobile, web, and virtual, augmented, and mixed reality platforms, and it can also be used to develop non-game software.
