Measuring quality of experience (QoE) aims to capture the factors that contribute to a user's perceptual experience, including human, system, and context factors. Since QoE stems from human interaction with various devices, its estimation should begin by investigating the mechanisms of human visual perception, which makes measuring QoE a challenging task. In this standard, QoE assessment is divided into two subcategories: perceptual quality and virtual reality (VR) cybersickness. In addition, deep learning models that consider human factors for various QoE assessments are covered, along with a reliable subjective test methodology and a database construction procedure.
This standard defines a framework for the Tactile Internet, including descriptions of various application scenarios, definitions and terminology, functions, and technical assumptions. Notably, the framework also includes a reference model and architecture, which define common architectural entities, the interfaces between those entities, and the mapping of functions to those entities. The Tactile Internet encompasses mission-critical applications (e.g., manufacturing, transportation, healthcare, and mobility) as well as non-critical applications (e.g., edutainment and events).
This standard specifies the general technical framework, components, integration, and main business processes of augmented reality systems applied to mobile devices, and defines their technical requirements, including functional requirements, performance requirements, safety requirements, and corresponding test methods. This standard is applicable to the design, development, and management of augmented-reality-enabled applications, or features of applications, on mobile devices.
This standard defines the architecture required to implement a virtual reality system, used for training, that can simulate responses to possible disasters in physical spaces where users can actually move around with six degrees of freedom. This reference architecture includes the physical-to-virtual component, which transfers sensor data from the physical space to the virtual world; the virtual-to-virtual component, which conveys data between virtual-world objects; and the virtual-to-physical component, which transfers the simulated responses in the virtual world to actuators in the physical world.
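As a rough illustration of this flow (not the normative interfaces of the standard itself), the sketch below models the three components as a simple pipeline in Python; all class, field, and device names are hypothetical placeholders.

```python
from dataclasses import dataclass

# Hypothetical sensor reading captured in the physical training space.
@dataclass
class SensorReading:
    sensor_id: str
    position: tuple       # user position, tracked with six degrees of freedom
    orientation: tuple

class PhysicalToVirtual:
    """Transfers sensor data from the physical space into the virtual world."""
    def ingest(self, reading: SensorReading) -> dict:
        return {"avatar": reading.sensor_id,
                "pose": (reading.position, reading.orientation)}

class VirtualToVirtual:
    """Conveys data between virtual-world objects (e.g., avatar and simulated fire)."""
    def propagate(self, virtual_event: dict) -> dict:
        # A disaster simulation would react to the avatar pose here.
        return {"actuator": "haptic_vest", "command": "vibrate", "cause": virtual_event}

class VirtualToPhysical:
    """Transfers simulated responses in the virtual world to physical actuators."""
    def actuate(self, response: dict) -> None:
        print(f"Driving {response['actuator']}: {response['command']}")

# Minimal end-to-end pass through the three architectural components.
p2v, v2v, v2p = PhysicalToVirtual(), VirtualToVirtual(), VirtualToPhysical()
reading = SensorReading("user-01", (1.0, 0.0, 2.5), (0.0, 90.0, 0.0))
v2p.actuate(v2v.propagate(p2v.ingest(reading)))
```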
The scope of the present document is to investigate the relevance of Virtual Reality in the context of 3GPP. Virtual Reality is the ability to be virtually present in a non-physical world created by the rendering of natural and/or synthetic images and sound, correlated with the movements of the immersed user, allowing interaction with that world. With recent progress in rendering devices such as head-mounted displays (HMDs), a significant quality of experience can be offered. By collecting comprehensive information on VR use cases, existing technologies, and subjective quality, the report attempts to identify potential gaps and relevant interoperability points that may require further work and potential standardization in 3GPP in order to support VR use cases and experiences in 3GPP user services. The report primarily focuses on 360-degree video and associated audio, which support three degrees of freedom (3DoF).
The present document defines the supported media formats, codecs, processing functions, and guaranteed minimum performance per AR device category. The present document addresses the interoperability gaps identified in the conclusions of TR 26.998.
The present document defines interoperable formats for Virtual Reality streaming services. Specifically, it defines operation points, media profiles, and presentation profiles for Virtual Reality. The present document builds on the findings and conclusions of TR 26.918.
This document specifies the syntax and semantics of data formats for interaction devices by defining an XML-schema-based language named Interaction Information Description Language (IIDL), which provides a standardized format for interfacing with actuators and sensors. IIDL provides a basic structure with common information for consistent communication with various actuators and sensors. On the basis of IIDL, the Device Command Vocabulary (DCV) is defined to provide a standardized format for commanding individual actuators, and the Sensed Information Vocabulary (SIV) is defined to provide a standardized format for holding information from individual sensors, either to obtain environmental information from the real world or to influence virtual-world objects using the acquired information.
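As a rough sketch of how such descriptions might be assembled (the element and attribute names below are hypothetical placeholders, not the normative IIDL/DCV/SIV names defined in this document), a sensed-information fragment and a device command could be serialized as follows:

```python
import xml.etree.ElementTree as ET

# Hypothetical SIV-style sensed information: a temperature reading wrapped in
# common IIDL-like structure (identifier, timestamp, value).
sensed = ET.Element("SensedInfo", {"id": "temp-01", "sensorType": "Temperature"})
ET.SubElement(sensed, "TimeStamp").text = "2024-01-01T12:00:00Z"
ET.SubElement(sensed, "Value", {"unit": "celsius"}).text = "21.5"

# Hypothetical DCV-style device command: switch a fan actuator on at half intensity.
command = ET.Element("DeviceCommand", {"id": "fan-01", "deviceType": "Fan"})
ET.SubElement(command, "Activate").text = "true"
ET.SubElement(command, "Intensity").text = "0.5"

# Serialize both fragments; a real IIDL document would carry the normative
# namespaces and schema-defined element names instead of these placeholders.
print(ET.tostring(sensed, encoding="unicode"))
print(ET.tostring(command, encoding="unicode"))
```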
This document provides definitions of data types and tools that are used in other parts of the ISO/IEC 23005 series but are not specific to a single part. It specifies the syntax and semantics of the data types and tools common to the tools defined in the other parts of the series, such as basic data types, which are used as basic building blocks in more than one of the tools; colour-related basic types, which are used in light- and colour-related tools to help specify colour-related characteristics of devices or commands; and timestamp types, which can be used in device commands and sensed information to specify timing-related information. Classification schemes, which provide the semantics of words or terms and a normative way of referencing them, are also defined in Annex A if they are used in more than one part of the ISO/IEC 23005 series. The tools defined in this document are not intended to be used alone, but as a part or a supporting tool of the tools defined in other parts of the ISO/IEC 23005 series, except for the profile and level definitions. This document also contains standard profiles and levels to be used in specific application domains; the profile and level definitions include collections of tools from ISO/IEC 23005-2 and ISO/IEC 23005-5 with necessary constraints.
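To give a feel for the kind of reuse described here (purely illustrative; the actual type names and constraints are those specified in the ISO/IEC 23005 series), common building blocks such as a colour value and a timestamp might be factored out once and then shared by command and sensed-information structures:

```python
from dataclasses import dataclass

# Hypothetical shared basic types, factored out so that both device commands
# and sensed information can reuse them (mirroring the role of common tools).
@dataclass
class ColourValue:
    red: int
    green: int
    blue: int            # e.g., the colour a light-type device should render

@dataclass
class TimeStamp:
    seconds: float       # timing information attached to a command or sensor sample

# Structures that would live in "other parts" of a hypothetical series,
# each reusing the common types defined above.
@dataclass
class LightCommand:
    colour: ColourValue
    issued_at: TimeStamp

@dataclass
class ColourSensorSample:
    measured: ColourValue
    sampled_at: TimeStamp

cmd = LightCommand(ColourValue(255, 120, 0), TimeStamp(0.0))
print(cmd)
```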