ISO/IEC 19775-2 specifies a language-independent application programmer interface (API) to a set of services and functions. For integration into a programming language, the X3D abstract interfaces are embedded in a language-dependent layer obeying the particular conventions of that language. ISO/IEC 19777-1:2006 specifies such a language-dependent layer for the ECMAScript language.
The Extensible 3D (X3D) specification, ISO/IEC 19775, specifies a language-independent application programmer interface (API) to a set of services and functions. For integration into a programming language, the X3D abstract interfaces are embedded in a language-dependent layer obeying the particular conventions of that language. ISO/IEC 19777-2:2006 specifies such a language-dependent layer for the Java programming language.
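The layering described in both parts — a language-independent service definition wrapped by a language-specific binding — can be sketched abstractly. The sketch below is a hypothetical illustration in Python (the standards themselves define ECMAScript and Java bindings); the class and method names are invented for the example and are not part of any of these specifications.

```python
from abc import ABC, abstractmethod

# Hypothetical illustration only: an abstract service interface (analogous to
# the language-independent API of ISO/IEC 19775-2) and a concrete
# language-dependent binding obeying the host language's conventions.
class BrowserService(ABC):
    """Abstract service, defined independently of any programming language."""

    @abstractmethod
    def get_name(self) -> str:
        """Return an identifying name for the browser implementation."""

class PyBrowserBinding(BrowserService):
    """A language-dependent layer realizing the abstract service."""

    def get_name(self) -> str:
        return "ExampleBrowser"

# Client code is written against the abstract interface; the binding supplies
# the language-specific realization.
browser: BrowserService = PyBrowserBinding()
print(browser.get_name())
```

The point of the split is that the abstract interfaces stay identical across bindings; only the outer layer changes per language.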
This document describes the glTF file format. glTF is an API-neutral runtime asset delivery format. glTF bridges the gap between 3D content creation tools and modern graphics applications by providing an efficient, extensible, interoperable format for the transmission and loading of 3D content.
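As a rough illustration of the format's shape, a minimal glTF 2.0 asset is plain JSON: a required `asset.version` plus index-linked arrays of scenes, nodes and meshes. The sketch below builds such a skeleton in Python; it is a hedged outline of the top-level structure, not a complete, renderable asset.

```python
import json

# Minimal sketch of a glTF 2.0 asset assembled as plain JSON.
# Top-level key names follow the glTF 2.0 schema; the mesh is left
# without primitives for brevity, so this is structural only.
gltf = {
    "asset": {"version": "2.0"},      # required: format version
    "scene": 0,                       # index of the default scene
    "scenes": [{"nodes": [0]}],       # a scene referencing node 0
    "nodes": [{"mesh": 0}],           # a node referencing mesh 0
    "meshes": [{"primitives": []}],   # geometry primitives would go here
}

# glTF cross-references are array indices, so a round trip through JSON
# preserves the whole structure.
text = json.dumps(gltf)
print(json.loads(text)["asset"]["version"])
```

Index-based cross-referencing, rather than nested objects, is what keeps the format efficient to transmit and cheap to load at runtime.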
This document defines the framework and information reference model for representing sensor-based 3D mixed-reality worlds. It defines concepts, an information model, architecture, system functions, and how to integrate 3D virtual worlds and physical sensors in order to provide mixed-reality applications with physical sensor interfaces. It defines an exchange format necessary for transferring and storing data between physical sensor-based mixed-reality applications. This document specifies the following functionalities:
a) representation of physical sensors in a 3D scene;
b) definition of physical sensors in a 3D scene;
c) representation of functionalities of each physical sensor in a 3D scene;
d) representation of physical properties of each physical sensor in a 3D scene;
e) management of physical sensors in a 3D scene; and
f) interface with physical sensor information in a 3D scene.
This document defines a reference model for physical sensor-based mixed-reality applications to represent and to exchange functions of physical sensors in 3D scenes. It does not define specific physical interfaces necessary for manipulating physical devices, but rather defines common functional interfaces that can be used interchangeably between applications. This document does not define how specific applications are implemented with specific physical sensor devices. It does not include computer-generated sensor information using computer input/output devices such as a mouse or a keyboard. The sensors in this document represent physical sensor devices in the real world.
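The functionalities listed above imply what a sensor's scene representation must carry: an identity, a placement in the scene, physical properties, and a functional interface. The sketch below is entirely hypothetical — the class and field names are invented for illustration, and the document itself defines the normative information model — but it shows the general shape of such a representation in Python.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a physical sensor represented as a node in a 3D
# scene: identity, type, placement, physical properties, and a functional
# interface. All names here are illustrative, not normative.
@dataclass
class PhysicalSensorNode:
    sensor_id: str
    sensor_type: str                                 # e.g. "temperature"
    position: tuple                                  # placement in the scene
    properties: dict = field(default_factory=dict)   # physical properties

    def read(self) -> float:
        """Functional interface: return the latest sensed value (stubbed)."""
        return self.properties.get("last_value", 0.0)

# A real-world thermometer represented in the scene.
thermometer = PhysicalSensorNode(
    sensor_id="s-01",
    sensor_type="temperature",
    position=(1.0, 0.5, 2.0),
    properties={"unit": "celsius", "last_value": 21.5},
)
print(thermometer.read())
```

Keeping the interface functional (a `read()` call) rather than device-specific matches the document's intent that interfaces be interchangeable between applications.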
This standard defines the basic framework within the overall architectural framework for motion training systems. This standard includes definitions for the functions and input/output interfaces of each component module, and the related data components and formats. To utilize this standard's basic framework, various application-specific user interface/user experience (UI/UX) and service frameworks are specified. The meaning of "motion training" includes:
- Human gestures and postures
- Animated human gestures and postures
- Expressions of human body animation for Virtual Reality (VR) and Mixed Reality (MR).
This document defines a set of concepts and their inter-relationships which should be applicable to the complete range of future computer graphics standards. It may be applied to verify and refine requirements for computer graphics; to identify needs for computer graphics standards and external interfaces; to develop models based on requirements for computer graphics; to define the architecture of new computer graphics standards; and to compare computer graphics standards.
The present document collects information on glass-type AR/MR devices in the context of 5G radio and network services. The primary scope of this Technical Report is the documentation of the following aspects:
- providing formal definitions for the functional structures of AR glasses, including their capabilities and constraints,
- documenting core use cases for AR services over 5G and defining relevant processing functions and reference architectures,
- identifying media exchange formats and profiles relevant to the core use cases,
- identifying necessary content delivery transport protocols and capability exchange mechanisms, as well as suitable 5G system functionalities (including device, edge, and network) and required QoS (including radio access and core network technologies),
- identifying key performance indicators and quality of experience factors,
- identifying relevant radio and system parameters (required bitrates, latencies, loss rates, range, etc.) to support the identified AR use cases and the required QoE, and
- providing a detailed overall power analysis for media AR related processing and communication.
This document analyses visualization elements that are key components of the interface between the physical asset and the avatar (digital replica of the physical asset).
ISO 14306:2017 defines the syntax and semantics of a file format for the 3D visualization and interrogation of lightweight geometry and product manufacturing information derived from CAD systems, using visualization software tools that do not need the full capability of a CAD system. ISO 14306:2017 has been adopted as a 3D visualization capability in addition to the ISO 10303 series. The file format supports the following information:
(1) facet information (triangles), stored with geometry compression techniques;
(2) visual attributes such as lights, textures and materials;
(3) product manufacturing information, such as dimensions, tolerances and other attributes;
(4) boundary representation (b-rep) solid model shape representations, with several alternatives available, including a representation based on the geometry standard defined in ISO 10303;
(5) configuration representations;
(6) delivery methods such as asynchronous streaming of content.
ISO 14306:2017 does not specify the implementation of, or definition of, a run-time architecture for viewing or processing of the file format.
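Item (1), facet information, typically means indexed triangle data: a shared vertex table plus per-triangle indices. The sketch below is a generic Python illustration of why indexed storage already shrinks the data before any JT-specific geometry compression is applied; it is not the JT binary encoding, whose details the standard itself defines.

```python
# Generic illustration of facet (triangle) storage for lightweight
# visualization: shared vertices plus triangle indices. This is NOT the
# JT binary encoding, which additionally applies geometry compression.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0),
    (0.0, 0.0, 1.0),
]
triangles = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]  # a tetrahedron

# Naive per-face storage repeats each shared vertex; indexed storage keeps
# one coordinate table and references it by index.
flat_count = 3 * len(triangles) * 3                      # 3 verts * 3 coords per face
indexed_count = 3 * len(vertices) + 3 * len(triangles)   # coords + indices
print(flat_count, indexed_count)
```

For this tetrahedron the naive layout needs 36 numbers versus 24 for the indexed layout, and the gap widens as vertices are shared by more triangles.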
This document is the first of a family of standards. ISO 12637-1:2006 defines a set of fundamental terms that can be used in the drafting of other International Standards for graphic technology. In order to facilitate their translation into other languages, the definitions are worded so as to avoid, where possible, any peculiarity attached to one language. The entries in ISO 12637-1:2006 are arranged alphabetically.
ISO/IEC 19775, X3D, defines a software system that integrates network-enabled 3D graphics and multimedia. Conceptually, each X3D application is a 3D time-based space that contains graphic and aural objects that can be dynamically modified through a variety of mechanisms. ISO/IEC 19775-1:2013 defines the architecture and base components of X3D. The semantics of X3D describe an abstract functional behaviour of time-based, interactive 3D, multimedia information. ISO/IEC 19775-1:2013 does not define physical devices or any other implementation-dependent concepts (e.g. screen resolution and input devices). It is intended for a wide variety of devices and applications, and provides wide latitude in interpretation and implementation of the functionality. For example, it does not assume the existence of a mouse or 2D display device. Each X3D application:
(1) implicitly establishes a world coordinate space for all objects defined, as well as all objects included by the application;
(2) explicitly defines and composes a set of 3D and multimedia objects;
(3) can specify hyperlinks to other files and applications;
(4) can define programmatic or data-driven object behaviours;
(5) can connect to external modules or applications via programming and scripting languages;
(6) explicitly declares its functional requirements by specifying a profile; and
(7) can declare additional functional requirements by specifying components.
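Point (6), explicit profile declaration, is visible in the XML encoding of an X3D scene, where the root element carries a `profile` attribute. The following Python sketch assembles a minimal scene (a single Box shape under the Interchange profile); the attribute values chosen here are illustrative.

```python
import xml.etree.ElementTree as ET

# Sketch of a minimal X3D scene in the XML encoding. The root element's
# "profile" attribute is the explicit declaration of functional
# requirements noted in point (6); values here are illustrative.
root = ET.Element("X3D", {"profile": "Interchange", "version": "3.3"})
scene = ET.SubElement(root, "Scene")
shape = ET.SubElement(scene, "Shape")
ET.SubElement(shape, "Box", {"size": "2 2 2"})  # one geometric object

document = ET.tostring(root, encoding="unicode")
print(document)
```

A browser reading this file knows from the single `profile` attribute which component set it must support before it renders anything.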
This document defines a reference model and base components for representing and controlling a single live actor and entity (LAE) or multiple LAEs in a mixed and augmented reality (MAR) scene. It defines concepts, a reference model, system framework, functions and how to integrate a 2D/3D virtual world and LAEs, and their interfaces, in order to provide MAR applications with interfaces of LAEs. It also defines an exchange format necessary for transferring and storing LAE-related data between LAE-based MAR applications. This document specifies the following functionalities:
a) definitions for an LAE in MAR;
b) representation of an LAE;
c) representation of properties of an LAE;
d) sensing of an LAE in a physical world;
e) integration of an LAE into a 2D/3D virtual scene;
f) interaction between an LAE and objects in a 2D/3D virtual scene; and
g) transmission of information related to an LAE in an MAR scene.
This document defines a reference model for LAE representation-based MAR applications to represent and to exchange data related to LAEs in a 2D/3D virtual scene in an MAR scene. It does not define specific physical interfaces necessary for manipulating LAEs, nor how specific applications implement a specific LAE in an MAR scene; rather, it defines common functional interfaces for representing LAEs that can be used interchangeably between MAR applications.