This standard defines a framework for mixed reality content aimed at effective motion learning, including terms and definitions, requirements, and data formats. Mechanisms for synchronizing the motion sensor and projector coordinate systems are defined, and motion acquisition methods, application programming interfaces, and user interfaces are described.
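As a rough illustration of what synchronizing the sensor and projector coordinate systems involves, the sketch below maps a point measured in the motion-sensor frame into the projector frame using a pre-calibrated rigid transform. The matrix, function name, and values are hypothetical placeholders, not the mechanism defined by the standard.

```python
# Illustrative only: a calibrated 4x4 homogeneous transform taking sensor
# coordinates to projector coordinates. Values are placeholders.
import numpy as np

T_sensor_to_projector = np.array([
    [0.0, -1.0, 0.0, 0.25],   # hypothetical rotation + translation from calibration
    [1.0,  0.0, 0.0, 0.10],
    [0.0,  0.0, 1.0, 0.00],
    [0.0,  0.0, 0.0, 1.00],
])

def to_projector_frame(p_sensor):
    """Convert a 3D point from sensor coordinates to projector coordinates."""
    p_h = np.append(np.asarray(p_sensor, dtype=float), 1.0)  # homogeneous coordinates
    return (T_sensor_to_projector @ p_h)[:3]

print(to_projector_frame([0.5, 0.2, 1.0]))
```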
This standard specifies the requirements and test methods for the motion-to-photon (MTP) latency that causes virtual reality (VR) sickness while users consume VR content. It is applicable to VR content with respect to the software, hardware, and human factors that affect MTP latency.
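For orientation, MTP latency is the time between a sensed motion and the first photons of the frame that reflects it. The sketch below computes it from two timestamps; the data layout and the 20 ms figure are a commonly cited comfort target used here as an assumption, not values or test methods taken from this standard.

```python
# Minimal sketch of a motion-to-photon (MTP) latency measurement, assuming one
# timestamp from the motion sensor and one from display scan-out of the first
# updated frame. The 20 ms threshold is illustrative, not normative.
from dataclasses import dataclass

@dataclass
class MtpSample:
    motion_ts_ms: float   # time the head motion was sensed
    photon_ts_ms: float   # time the updated frame reached the display

    @property
    def latency_ms(self) -> float:
        return self.photon_ts_ms - self.motion_ts_ms

def mean_mtp_latency(samples) -> float:
    return sum(s.latency_ms for s in samples) / len(samples)

samples = [MtpSample(1000.0, 1017.5), MtpSample(1011.1, 1030.2)]
mean = mean_mtp_latency(samples)
print(f"mean MTP latency: {mean:.1f} ms, within 20 ms target: {mean <= 20.0}")
```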
This standard defines the architecture required to implement a virtual reality training system that simulates responses to possible disasters in physical spaces where users can actually move around with six degrees of freedom. This reference architecture includes the physical-to-virtual component, which transfers sensor data from the physical space to the virtual world; the virtual-to-virtual component, which conveys data between virtual world objects; and the virtual-to-physical component, which transfers the simulated responses in the virtual world to actuators in the physical world.
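The sketch below only restates the three data paths named in that description as code. The component names follow the scope text, but the method signatures and payloads are hypothetical and are not interfaces defined by the standard.

```python
# Hypothetical sketch of the three data paths in the reference architecture.
class PhysicalToVirtual:
    def forward_sensor_data(self, sensor_id: str, reading: dict) -> dict:
        # e.g. a 6-DoF tracker reading mapped onto the user's avatar pose
        return {"virtual_object": f"avatar/{sensor_id}", "pose": reading}

class VirtualToVirtual:
    def propagate(self, source_object: str, event: dict, targets: list[str]) -> list[dict]:
        # e.g. a simulated fire event delivered to nearby virtual objects
        return [{"target": t, "event": event, "from": source_object} for t in targets]

class VirtualToPhysical:
    def drive_actuator(self, actuator_id: str, response: dict) -> None:
        # e.g. trigger a haptic or heat actuator matching the simulated response
        print(f"actuator {actuator_id} <- {response}")
```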
This standard specifies the general technical framework, components, integration, and main business processes of augmented reality systems applied to mobile devices, and defines their technical requirements, including functional requirements, performance requirements, safety requirements, and corresponding test methods. This standard is applicable to the design, development, and management of augmented-reality-enabled applications, or features of applications, on mobile devices.
This standard defines a framework for the Tactile Internet, including descriptions of various application scenarios, definitions and terminology, functions, and technical assumptions. The framework notably includes a reference model and architecture, which define common architectural entities, the interfaces between those entities, and the mapping of functions to those entities. The Tactile Internet encompasses mission-critical applications (e.g., manufacturing, transportation, healthcare, and mobility) as well as non-critical applications (e.g., edutainment and events).
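To make the phrase "mapping of functions to those entities" concrete, the sketch below shows one possible way such a mapping could be represented: a registry associating functions with architectural entities. The entity and function names are invented placeholders, not the entities or interfaces defined by this standard.

```python
# Hypothetical function-to-entity mapping; names are placeholders only.
REFERENCE_MODEL = {
    "tactile_edge_A": ["haptic_sensing", "local_control_loop"],
    "network_domain": ["low_latency_transport", "qos_management"],
    "tactile_edge_B": ["haptic_actuation", "feedback_rendering"],
}

def entity_for(function_name: str) -> str:
    """Return the architectural entity to which a function is mapped."""
    for entity, functions in REFERENCE_MODEL.items():
        if function_name in functions:
            return entity
    raise KeyError(function_name)

print(entity_for("qos_management"))  # -> network_domain
```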
Measuring quality of experience (QoE) aims to capture the factors that contribute to a user's perceptual experience, including human, system, and context factors. Because QoE stems from human interaction with various devices, estimation must start from the mechanisms of human visual perception, which makes measuring QoE a challenging task. In this standard, QoE assessment is divided into two subcategories: perceptual quality and virtual reality (VR) cybersickness. In addition, deep learning models that consider human factors for various QoE assessments are covered, along with a reliable subjective test methodology and a database construction procedure.
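The toy sketch below only illustrates the two assessment outputs named above: a perceptual quality score and a cybersickness score predicted from feature vectors. The linear scoring stands in for the deep learning models the standard covers; features, weights, and scales are assumptions for illustration.

```python
# Hypothetical two-output QoE assessment: perceptual quality and cybersickness.
import numpy as np

def predict_qoe(features: np.ndarray, w_quality: np.ndarray, w_sickness: np.ndarray) -> dict:
    return {
        "perceptual_quality_mos": float(np.clip(features @ w_quality, 1.0, 5.0)),  # 1-5 MOS scale
        "cybersickness_score": float(np.clip(features @ w_sickness, 0.0, 1.0)),    # 0 = none, 1 = severe
    }

features = np.array([0.8, 0.1, 0.3])  # e.g. sharpness, stalling, motion intensity (illustrative)
print(predict_qoe(features, np.array([4.0, -2.0, 0.5]), np.array([0.1, 0.2, 0.9])))
```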
Augmented reality (AR) promises significant gains in operational efficiency by making information available to employees who need task support, in context and in real time. To support such implementations of AR training systems, this document proposes an overarching integrated conceptual model that describes the interactions between the physical world, the user, digital information, the context for AR-assisted learning, and other parameters of the environment. It defines two data models, and their bindings to XML and JSON, for representing learning activities (also known as employee tasks and procedures) and the learning environment in which these tasks are performed (also known as the workplace). The interoperability specification and standard supports an open market in which interchangeable component products provide alternatives to monolithic AR-assisted learning systems. It also facilitates the creation of experience repositories and online marketplaces for AR-enabled learning content. Specific attention is given to reuse and repurposing of existing learning content and to catering for 'mixed' experiences that combine real-world learner guidance with the consumption (or production) of traditional content such as instructional video material or learning apps and widgets.
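To illustrate the kind of content the two data models describe (a learning activity and the workplace in which it is performed), the sketch below emits a small JSON document. The field names and structure are illustrative placeholders and do not reproduce the element names or bindings defined by the standard.

```python
# Hypothetical JSON shape for a learning activity and its workplace.
import json

content = {
    "activity": {
        "id": "replace-filter-cartridge",
        "steps": [
            {"id": "step-1", "instruction": "Shut the inlet valve", "asset": "video/shutoff.mp4"},
            {"id": "step-2", "instruction": "Unscrew the housing", "asset": "model/housing.glb"},
        ],
    },
    "workplace": {
        "id": "pump-room-3",
        "trackables": [{"id": "valve-marker", "type": "image-target"}],
    },
}
print(json.dumps(content, indent=2))
```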
This document specifies requirements for the preparation, revision, and presentation of digital product definition data, hereafter referred to as data sets, complementing existing standards. It supports two methods of application: 3D model only, and 3D model with 2D drawing in digital format. The structure of this document presents requirements common to both methods, followed by clauses providing any essential requirements that differ between the methods. Additionally, its use in conjunction with computer-aided design (CAD) systems can assist the progression towards improved modelling and annotation practices for CAD and engineering disciplines, as well as serving as a guideline for CAx software developers.
The present document provides a study on the QoE metrics relevant to VR services. The study focuses on:
1. Defining a device reference model for VR QoE measurement points.
2. Studying key performance indicators that may impact the experience of VR services.
3. Identifying the existing QoE parameters and metrics defined in SA4 standards, such as TS 26.247 and TS 26.114, that are relevant to the Virtual Reality user experience.
4. Identifying and defining new QoE parameters and metrics relevant to the Virtual Reality user experience, taking into consideration the use cases listed in TR 26.918 and any sources that show the relevance of new metrics, e.g. scientific literature or specifications/solutions from other standards organizations.
5. Analysing potential improvements to the existing QoE reporting so as to better accommodate VR services.
6. Providing recommendations for future standards work in SA4 on the QoE parameters and metrics and, as necessary, coordinating with other 3GPP groups and external SDOs, e.g. MPEG and ITU-T.
The scope of the present document is to investigate the relevance of Virtual Reality in the context of 3GPP. Virtual Reality is the ability to be virtually present in a non-physical world, created by rendering natural and/or synthetic images and sound correlated with the movements of the immersed user, allowing the user to interact with that world. With recent progress in rendering devices such as head-mounted displays (HMDs), a significant quality of experience can be offered. By collecting comprehensive information on VR use cases, existing technologies, and subjective quality, the report attempts to identify potential gaps and relevant interoperability points that may require further work and potential standardization in 3GPP in order to support VR use cases and experiences in 3GPP user services. The report primarily focuses on 360-degree video and associated audio, which support three degrees of freedom (3DoF).
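As a small aside on what three degrees of freedom means in this context, a 3DoF renderer tracks head orientation (yaw, pitch, roll) but not translation. The helper below builds a rotation matrix from those three angles; it is an illustrative assumption and not part of the report.

```python
# Illustrative 3DoF head orientation: yaw, pitch, roll only, no translation.
import numpy as np

def head_orientation(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Rotation matrix for a 3DoF head pose, angles in radians (Z-Y-X order)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

print(head_orientation(np.pi / 2, 0.0, 0.0).round(3))
```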
The present document defines the supported media formats, codecs, processing functions, and guaranteed minimum performance per AR device category. It addresses the interoperability gaps identified in the conclusions of TR 26.998.