Ergonomics of human-system interaction - Part 920: Guidance on tactile and haptic interactions

ISO 9241-920:2009 gives recommendations for tactile and haptic hardware and software interactions. It provides guidance on the design and evaluation of hardware, software, and combined hardware/software interactions, including: the design and use of tactile/haptic inputs, outputs, and combinations of the two, whether used in combination with other modalities or as the exclusive mode of interaction; the tactile/haptic encoding of information, including textual data, graphical data and controls; the design of tactile/haptic objects; the layout of tactile/haptic space; and interaction techniques. It does not provide recommendations specific to Braille, but can apply to interactions that make use of Braille. The recommendations given in ISO 9241-920:2009 are applicable to at least the controls of a virtual workspace, but they can also be applied to an entire virtual environment, consistent, as far as possible, with the simulation requirements.
ISO 9241-920:2009

Ergonomics of human-system interaction - Part 910: Framework for tactile and haptic interaction

ISO 9241-910:2011 provides a framework for understanding and communicating various aspects of tactile/haptic interaction. It defines terms, describes structures and models, and gives explanations related to the other parts of the ISO 9241 "900" subseries. It also provides guidance on how various forms of interaction can be applied to a variety of user tasks. It is applicable to all types of interactive systems making use of tactile/haptic devices and interactions. It does not address purely kinaesthetic interactions, such as gestures, although it might be useful for understanding such interactions.
ISO 9241-910:2011

Ergonomics of human-system interaction - Part 394: Ergonomic requirements for reducing undesirable biomedical effects of visually induced motion sickness during watching electronic images

This document establishes requirements and recommendations for image contents and electronic display systems to reduce visually induced motion sickness (VIMS) while viewing images on electronic displays. It is applicable to electronic display systems, including flat panel displays, projectors with a screen, and virtual reality (VR)-type head-mounted displays (HMDs), but not HMDs that present electronic images on or together with real-world scenes.

NOTE 1: This document assumes the images are viewed under appropriately defined conditions. See Annex B for the appropriate viewing conditions.

NOTE 2: This document is useful for the design, development, and supply of image contents, as well as electronic displays, for reducing VIMS.

NOTE 3: ISO 9241-392[3] provides guidelines for stereoscopic 3D displays, whose methods are also used in HMDs.

NOTE 4: The International Telecommunication Union (ITU) generally sets the standards for broadcasting.
ISO 9241-394:2020

Augmented and Virtual Reality safety - Guidance on safe immersion, setup and usage

The standard specifies how Augmented Reality and Virtual Reality (AR/VR) devices should be set up and used in the enterprise, in a manner that ensures health and safety (H&S) is maintained, H&S consequences are understood, and additional risks are not introduced. Within this concept of safe usage, there is particular focus on guidance around safe immersion (time) and safety in the workplace. This ISO/IEC standard:

(a) defines the concepts of AR, VR, the virtuality continuum and other associated terms such as Augmented Virtuality and Mixed Reality;

(b) provides guidance on setting up AR systems;

(c) provides guidance on setting up VR systems;

(d) provides guidance on safe usage and immersion in AR systems both in the consumer and enterprise domains; and

(e) provides guidance on safe usage and immersion in VR systems both in the consumer and enterprise domains.

This standard focuses on visual aspects of AR and VR. Other modalities, such as haptics and olfaction, are not addressed within this standard. The standard covers both the hardware (the physical VR/AR head-mounted displays) and the visual stimulus (the environments and graphics displayed in those headsets). The standard does not cover all possible visual stimulus scenarios; focus is directed toward those areas known to have implications for safe use. This specifically includes vection (the visual illusion of self-motion in physically stationary VR/AR users) and/or motion (physical movement of VR/AR users) and the associated safe-use considerations. It should be noted that AR and VR share some safety concerns, but many are specific to either AR or VR, or to a consumer or enterprise environment. As such, all of these are in scope, and the standard is structured to account for these differences.
ISO/IEC DIS 5927

Standard for Immersive Visual Content Coding

Efficient coding tool sets for the compression, decompression, and reconstruction of immersive visual content data are provided. The target applications and services include, but are not limited to, virtual reality (VR), such as unmanned-aerial-vehicle-based VR, augmented reality, panoramic video, free-view TV, panoramic stereo video, and other video-/audio-enabled services and applications, such as immersive video streaming, broadcasting, storage, and communication.
IEEE 1857.9-2021

Augmented Reality Framework (ARF); AR framework architecture

The present document specifies a functional reference architecture for AR components, systems and services. The structure of this architecture and the functionalities of its components have been derived from a collection of use cases, ETSI GR ARF 002 (V1.1.1): Augmented Reality Framework (ARF); Industrial use cases for AR applications and services, and an overview of the current landscape of AR standards, ETSI GR ARF 001 (V1.1.1): Augmented Reality Framework (ARF); AR standards landscape. The present document introduces the characteristics of an AR system and describes the functional building blocks of the AR reference architecture and their mutual relationships. The generic nature of the architecture is validated by mapping the workflow of several use cases to the components of this framework architecture.
ETSI GS ARF 003 V1.1.1

Information technology - Multimedia application format (MPEG-A) - Part 13: Augmented reality application format

ISO/IEC 23000-13:2017 specifies the following:

- scene description elements for representing AR content;

- mechanisms to connect to local and remote sensors and actuators;

- mechanisms to integrate compressed media (image, audio, video, graphics); and

- mechanisms to connect to remote resources such as maps and compressed media.
ISO/IEC 23000-13:2017

Information technology - Computer graphics, image processing and environmental representation - Sensor representation in mixed and augmented reality

This document defines the framework and information reference model for representing sensor-based 3D mixed-reality worlds. It defines concepts, an information model, architecture, system functions, and how to integrate 3D virtual worlds and physical sensors in order to provide mixed-reality applications with physical sensor interfaces. It defines an exchange format necessary for transferring and storing data between physical sensor-based mixed-reality applications. This document specifies the following functionalities:

a) representation of physical sensors in a 3D scene;

b) definition of physical sensors in a 3D scene;

c) representation of functionalities of each physical sensor in a 3D scene;

d) representation of physical properties of each physical sensor in a 3D scene;

e) management of physical sensors in a 3D scene; and

f) interface with physical sensor information in a 3D scene.

This document defines a reference model for physical sensor-based mixed-reality applications to represent and to exchange functions of physical sensors in 3D scenes. It does not define specific physical interfaces necessary for manipulating physical devices, but rather defines common functional interfaces that can be used interchangeably between applications. This document does not define how specific applications are implemented with specific physical sensor devices. It does not include computer generated sensor information using computer input/output devices such as a mouse or a keyboard. The sensors in this document represent physical sensor devices in the real world.
ISO/IEC 18038:2020
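
The sensor-representation functionalities this standard enumerates (a physical sensor's identity, type, pose, and physical properties in a 3D scene, plus its management and interface) can be sketched as a small information model. The sketch below is hypothetical and illustrative only; the class and field names are not taken from ISO/IEC 18038.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a physical-sensor record embedded in a 3D scene,
# illustrating the kind of information such a model carries: identity,
# sensor type, pose in scene coordinates, and physical properties.
# All names here are illustrative, not normative.
@dataclass
class SceneSensor:
    sensor_id: str                              # unique identifier within the scene
    sensor_type: str                            # e.g. "temperature", "camera", "GPS"
    position: tuple = (0.0, 0.0, 0.0)           # location in scene coordinates
    orientation: tuple = (0.0, 0.0, 0.0, 1.0)   # quaternion (x, y, z, w)
    properties: dict = field(default_factory=dict)  # physical properties (units, range, ...)

# A minimal scene that manages sensors and exposes a lookup interface
# (in the spirit of functionalities e and f above).
class Scene:
    def __init__(self):
        self._sensors = {}

    def add_sensor(self, sensor: SceneSensor) -> None:
        self._sensors[sensor.sensor_id] = sensor

    def get_sensor(self, sensor_id: str) -> SceneSensor:
        return self._sensors[sensor_id]

scene = Scene()
scene.add_sensor(SceneSensor("t1", "temperature",
                             properties={"units": "celsius", "range": (-40, 85)}))
print(scene.get_sensor("t1").sensor_type)
```

An exchange format in the sense of the standard would serialize records like these for transfer between applications; the in-memory form above only shows what information travels.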

Information technology - Runtime 3D asset delivery format - Khronos glTF™ 2.0

This document describes the glTF file format. glTF is an API-neutral runtime asset delivery format. glTF bridges the gap between 3D content creation tools and modern graphics applications by providing an efficient, extensible, interoperable format for the transmission and loading of 3D content.
ISO/IEC 12113:2022
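
As a concrete illustration of the format's shape (not drawn from the abstract above): a glTF file is a JSON document whose only strictly required top-level property is `asset`, carrying a `version` string. The minimal document below, built in Python, adds the smallest useful scene/node pair; the node name "root" is illustrative.

```python
import json

# Minimal glTF 2.0 document: "asset" (with "version") is the only
# top-level property the format strictly requires; the scene and node
# below are the smallest useful addition.
gltf = {
    "asset": {"version": "2.0"},   # mandatory version declaration
    "scene": 0,                    # index of the default scene
    "scenes": [{"nodes": [0]}],    # one scene referencing node 0
    "nodes": [{"name": "root"}],   # a single empty node, name is illustrative
}

document = json.dumps(gltf, indent=2)
print(document)
```

Real assets additionally carry meshes, accessors, and buffer views that reference binary geometry data, which is what makes the format efficient to load at runtime.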

Information technology - Computer graphics and image processing - Extensible 3D (X3D) language bindings - Part 2: Java

The Extensible 3D (X3D) specification, ISO/IEC 19775, specifies a language-independent application programmer interface (API) to a set of services and functions. For integration into a programming language, the X3D abstract interfaces are embedded in a language-dependent layer obeying the particular conventions of that language. ISO/IEC 19777-2:2006 specifies such a language-dependent layer for the Java programming language.
ISO/IEC 19777-2:2006

Information technology - Computer graphics and image processing - Extensible 3D (X3D) language bindings - Part 1: ECMAScript

ISO/IEC 19775-2 specifies a language-independent application programmer interface (API) to a set of services and functions. For integration into a programming language, the X3D abstract interfaces are embedded in a language-dependent layer obeying the particular conventions of that language. ISO/IEC 19777-1:2006 specifies such a language-dependent layer for the ECMAScript language.
ISO/IEC 19777-1:2006

Information technology - Computer graphics, image processing and environmental data representation - Mixed and augmented reality (MAR) reference model

This document defines the scope and key concepts of mixed and augmented reality, the relevant terms and their definitions and a generalized system architecture that together serve as a reference model for mixed and augmented reality (MAR) applications, components, systems, services and specifications. This architectural reference model establishes the set of required sub-modules and their minimum functions, the associated information content and the information models to be provided and/or supported by a compliant MAR system. The reference model is intended for use by current and future developers of MAR applications, components, systems, services or specifications to describe, compare, contrast and communicate their architectural design and implementation. The MAR reference model is designed to apply to MAR systems independent of specific algorithms, implementation methods, computational platforms, display systems and sensors or devices used. This document does not specify how a particular MAR application, component, system, service or specification is designed, developed or implemented. It does not specify the bindings of those designs and concepts to programming languages or the encoding of MAR information through any coding technique or interchange format. This document contains a list of representative system classes and use cases with respect to the reference model.
ISO/IEC 18039:2019