Citiverse




Information technology - Computer graphics, image processing and environmental representation - Sensor representation in mixed and augmented reality

This document defines the framework and information reference model for representing sensor-based 3D mixed-reality worlds. It defines concepts, an information model, architecture, system functions, and how to integrate 3D virtual worlds and physical sensors in order to provide mixed-reality applications with physical sensor interfaces. It defines an exchange format necessary for transferring and storing data between physical sensor-based mixed-reality applications. This document specifies the following functionalities:

a) representation of physical sensors in a 3D scene;

b) definition of physical sensors in a 3D scene;

c) representation of functionalities of each physical sensor in a 3D scene;

d) representation of physical properties of each physical sensor in a 3D scene;

e) management of physical sensors in a 3D scene; and

f) interface with physical sensor information in a 3D scene.

This document defines a reference model for physical sensor-based mixed-reality applications to represent and to exchange functions of physical sensors in 3D scenes. It does not define specific physical interfaces necessary for manipulating physical devices, but rather defines common functional interfaces that can be used interchangeably between applications. This document does not define how specific applications are implemented with specific physical sensor devices. It does not include computer generated sensor information using computer input/output devices such as a mouse or a keyboard. The sensors in this document represent physical sensor devices in the real world.
ISO/IEC 18038:2020

Standard for Immersive Visual Content Coding

Efficient coding tool sets for compression, decompression, and reconstruction of immersive visual content data are provided. The target applications and services include, but are not limited to, virtual reality (VR), such as unmanned aerial vehicle-based VR, augmented reality, panorama video, free-view TV, panoramic stereo video, and other video-/audio-enabled services and applications, such as immersive video streaming, broadcasting, storage, and communication.
IEEE 1857.9-2021

Ergonomics of human-system interaction - Part 430: Recommendations for the design of non-touch gestural input for the reduction of biomechanical stress

This document provides guidance on the design, selection and optimization of non-contacting hand and arm gestures for human-computer interaction. It addresses the assessment of usability and fatigue associated with different gesture set designs and provides recommendations for approaches to evaluating the design and selection of gestures. This document also provides guidance on the documentation of the process for selecting gesture sets. This document applies to gestures expressed by humans. It does not consider the technology for detecting gestures or the system response when interpreting a gesture. Non-contacting hand gestures can be used for input in a variety of settings, including the workplace or in public settings and when using fixed screens, mobile, virtual reality, augmented reality or mixed-mode reality devices.
ISO/TS 9241-430:2021

Building information models - Information delivery manual - Part 2: Interaction framework

ISO 29481-2:2012 specifies a methodology and format for describing coordination between actors in a building construction project during all life cycle stages. It therefore specifies: a methodology that describes an interaction framework (an appropriate way to map responsibilities and interactions that provides a process context for information flow); and a format in which the interaction framework should be specified. ISO 29481-2:2012 is intended to facilitate interoperability between software applications used in the construction process, to promote digital collaboration between actors in the building construction process, and to provide a basis for accurate, reliable, repeatable, and high-quality information exchange.
ISO 29481-2:2012

Augmented and Virtual Reality safety - Guidance on safe immersion, setup and usage

The standard specifies how Augmented Reality and Virtual Reality (AR/VR) devices should be set up and used in the enterprise, in a manner that ensures Health and Safety (H&S) is maintained, that H&S consequences are understood, and that additional risks are not introduced. Within this concept of safe usage, there is particular focus on guidance around safe immersion (time) and safety in the workplace. This ISO/IEC standard:

(a) defines the concepts of AR, VR, the virtuality continuum and other associated terms such as Augmented Virtuality and Mixed Reality;

(b) provides guidance on setting up AR systems;

(c) provides guidance on setting up VR systems;

(d) provides guidance on safe usage and immersion in AR systems both in the consumer and enterprise domains; and

(e) provides guidance on safe usage and immersion in VR systems both in the consumer and enterprise domains.

This standard focuses on visual aspects of AR and VR. Other modes such as haptics and olfactory stimuli are not addressed within this standard. The standard covers both the hardware (the physical VR/AR head-mounted displays) and areas of visual stimulus (the environments and graphics displayed in those headsets). The standard does not cover all possible visual stimulus scenarios; focus is directed toward those areas that are known to have implications for safe use. This specifically includes sources of vection (the visual illusion of self-motion in physically stationary VR/AR users) and/or motion (physical movement of VR/AR users) and the associated safe-use considerations. It should be noted that AR and VR share some safety concerns, but many are specific to either AR or VR, or to a consumer or enterprise environment. As such, all of these are in scope, and the standard is structured to account for these differences.
ISO/IEC DIS 5927

Ergonomics of human-system interaction - Part 394: Ergonomic requirements for reducing undesirable biomedical effects of visually induced motion sickness during watching electronic images

This document establishes the requirements and recommendations for image contents and electronic display systems to reduce visually induced motion sickness (VIMS) while viewing images on electronic displays. This document is applicable to electronic display systems, including flat panel displays, projectors with a screen, and virtual reality (VR)-type head-mounted displays (HMDs), but not including HMDs that present electronic images on/with real-world scenes.

NOTE 1: This document assumes the images are viewed under appropriately defined conditions. See Annex B for the appropriate viewing conditions.

NOTE 2: This document is useful for the design, development, and supply of image contents, as well as electronic displays for reducing VIMS.

NOTE 3: ISO 9241-392 [3] provides guidelines for stereoscopic 3D displays, whose methods are also used in HMDs.

NOTE 4: The International Telecommunication Union (ITU) generally sets the standards for broadcasting.
ISO 9241-394:2020

Ergonomics of human-system interaction - Part 910: Framework for tactile and haptic interaction

ISO 9241-910:2011 provides a framework for understanding and communicating various aspects of tactile/haptic interaction. It defines terms, describes structures and models, and gives explanations related to the other parts of the ISO 9241 "900" subseries. It also provides guidance on how various forms of interaction can be applied to a variety of user tasks. It is applicable to all types of interactive systems making use of tactile/haptic devices and interactions. It does not address purely kinaesthetic interactions, such as gestures, although it might be useful for understanding such interactions.
ISO 9241-910:2011

Ergonomics of human-system interaction - Part 920: Guidance on tactile and haptic interactions

ISO 9241-920:2009 gives recommendations for tactile and haptic hardware and software interactions. It provides guidance on the design and evaluation of hardware, software, and combinations of hardware and software interactions, including: the design/use of tactile/haptic inputs, outputs, and/or combinations of inputs and outputs, with general guidance on their design/use as well as on designing/using combinations of tactile and haptic interactions in combination with other modalities or as the exclusive mode of interaction; the tactile/haptic encoding of information, including textual data, graphical data and controls; the design of tactile/haptic objects; the layout of tactile/haptic space; and interaction techniques. It does not provide recommendations specific to Braille, but can apply to interactions that make use of Braille. The recommendations given in ISO 9241-920:2009 are applicable to at least the controls of a virtual workspace, but they can also be applied to an entire virtual environment, consistent, as far as possible, with the simulation requirements.
ISO 9241-920:2009

Ergonomics of human-system interaction - Part 940: Evaluation of tactile and haptic interactions

This document:

- describes the types of methods that can be used for the evaluation of haptic devices and of systems that include haptic devices;

- specifies a procedure for the evaluation of haptic interactions by a usability walkthrough or usability test (see Annex J); and

- provides guidance on the types of methods that are appropriate for the evaluation of specific attributes of haptic systems, cross-referenced to the guidance in the relevant clauses of other International Standards (see Annexes A, B, C, D, E, F and G).

It applies to the following types of interaction:

- augmented reality: information overlaid on a real scene, e.g. a vibrating belt indicating distance;

- gesture control of a device or a virtual scenario;

- unidirectional interaction, such as a vibrating phone or a vibrating belt;

- virtual environment: a virtual space with which a user can interact with the aid of a haptic device.

This document applies to the following types of devices:

- gesture sensors, e.g. video that discerns 3D hand movements, touch screens that sense 2D touches;

- kinaesthetic haptic devices, e.g. desktop haptic interfaces;

- tactile displays, e.g. vibrating phones.

This document is not applicable to standard input devices such as keyboards, mice or track balls.

NOTE: ISO 9241-400 covers standard input devices, and ISO 9241-411 applies to the evaluation of input devices such as keyboards and mice.

This document can be used to identify the types of methods and measures for:

- establishing benchmarks,

- establishing requirements for haptic interaction,

- identifying problems with haptic interaction (formative evaluation), and

- using criteria to establish whether a haptic system meets requirements (summative evaluation).
ISO 9241-940:2017

Ergonomics of human-system interaction - Part 380: Survey result of HMD (Head-Mounted Displays) characteristics related to human-system interaction

This document provides information based on a study of the characteristics of head-mounted displays (HMDs) regarding the ergonomics of human-system interaction. Although this document covers the broad range of ergonomics issues that arise, it specifically provides more-detailed information about the visual aspects of the interaction, and it provides information that could form the basis for future possible standards related to HMDs.

NOTE: It is preferable to take a systematic approach to considering the characteristics of HMDs, since an HMD affects the viewer not only through visual aspects but also through other physical aspects.
ISO/TR 9241-380:2022

Ergonomics of human-system interaction - Part 393: Structured literature review of visually induced motion sickness during watching electronic images

This document gives the scientific summaries of visually induced motion sickness resulting from images presented visually on or by electronic display devices. Electronic displays include flat panel displays, electronic projections on a flat screen, and head-mounted displays. Different aspects of human-system interaction are covered in other parts of the ISO 9241 series (see Annex A).
ISO/TR 9241-393:2020

Information technology - Multimedia application format (MPEG-A) - Part 13: Augmented reality application format

ISO/IEC 23000-13:2017 specifies the following:

- scene description elements for representing AR content; and

- mechanisms to connect to local and remote sensors and actuators;

- mechanisms to integrate compressed media (image, audio, video, graphics); and

- mechanisms to connect to remote resources such as maps and compressed media.
ISO/IEC 23000-13:2017