MPEG-V (Media context and control), first published as ISO/IEC 23005 and now in its 4th edition, provides an architecture and specifies associated information representations to enable interoperability between virtual worlds (e.g., digital content providers of a virtual world, (serious) gaming, and simulation) and between virtual worlds and the real world (e.g., sensors, actuators, vision and rendering, and robotics). MPEG-V is applicable in various business models and domains in which audiovisual content can be associated with sensorial effects that need to be rendered on appropriate actuators and/or can benefit from well-defined interaction with an associated virtual world.
Media Context and Control
Part 1 (Architecture): MPEG-V provides an architecture and specifies associated information representations to enable interoperability between virtual worlds and between the real and virtual worlds. This part outlines the architecture and lists the components, APIs, and use cases.
Part 2 (Control information): This part specifies the syntax and semantics of the tools required to provide interoperability in controlling devices in both the real and virtual worlds. It covers the user's sensory effect preferences, sensory device capabilities, and sensor capabilities.
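The interplay of these three descriptions can be illustrated with a small sketch: an adaptation engine maps an authored effect intensity onto what a device can actually render, capped by the user's preference. All class and field names below are illustrative placeholders, not the normative MPEG-V (ISO/IEC 23005-2) schema types.

```python
from dataclasses import dataclass

@dataclass
class DeviceCapability:
    """Illustrative stand-in for a sensory device capability description."""
    max_intensity: float  # strongest output the actuator can render (device units)

@dataclass
class UserPreference:
    """Illustrative stand-in for a user's sensory effect preference."""
    max_intensity_ratio: float  # 0.0-1.0 cap chosen by the user

def adapt_intensity(authored_ratio: float,
                    cap: DeviceCapability,
                    pref: UserPreference) -> float:
    """Map an authored effect intensity (0.0-1.0) onto the device's range,
    never exceeding the user's preferred maximum."""
    effective = min(authored_ratio, pref.max_intensity_ratio)
    return effective * cap.max_intensity

fan = DeviceCapability(max_intensity=10.0)      # e.g., wind speed in m/s
pref = UserPreference(max_intensity_ratio=0.5)  # user wants at most half strength
print(adapt_intensity(0.8, fan, pref))          # authored 80 % is capped to 5.0 m/s
```

The point of standardizing all three descriptions is precisely that such an adaptation step can be performed by any conforming engine, regardless of which vendor supplied the content, the device, or the preference profile.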
Part 3 (Sensory information): This part specifies the digital representation of information for senses other than vision and audition, e.g., olfaction, mechanoreception, equilibrioception, thermoception, and proprioception.
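As a concrete (non-normative) sketch of what such a representation looks like in practice, the snippet below builds a tiny XML description of two sensorial effects, a wind burst and a scent. The element and attribute names are invented for illustration; the actual MPEG-V sensory effect description language defines its own normative schema.

```python
import xml.etree.ElementTree as ET

# Illustrative effect metadata: a 2-second wind effect at 60 % intensity,
# followed by a rose scent at 30 % intensity. These tag names are
# placeholders, not the normative MPEG-V schema.
root = ET.Element("SensoryEffects")
ET.SubElement(root, "Effect", type="wind", intensity="0.6", duration="2000")
ET.SubElement(root, "Effect", type="scent", intensity="0.3", scentID="rose")

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Because the representation is declarative metadata rather than device commands, the same description can drive a fan, a scent dispenser, or a simulator, depending on what actuators are available.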
Part 4 (Virtual world object characteristics): This part specifies a set of types used to characterize metadata of virtual world objects (avatars), making it possible to migrate a virtual object or its characteristics from one virtual world to another and to control a virtual world object from real-world devices.
Part 5 (Data formats for interaction devices): This part specifies the syntax and semantics of the data formats for interaction devices (Device Commands and Sensed Information) required to provide interoperability in controlling and sensing interaction devices.
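The two data formats are symmetric: commands flow from the adaptation engine out to actuators, while sensed information flows back from real-world sensors. A minimal sketch of that pairing, with invented field names rather than the normative MPEG-V (ISO/IEC 23005-5) structures:

```python
from dataclasses import dataclass

@dataclass
class DeviceCommand:
    """Illustrative command sent to an actuator (e.g., a fan or lamp)."""
    device_id: str
    activate: bool
    intensity: float  # normalized 0.0-1.0

@dataclass
class SensedInformation:
    """Illustrative reading returned from a real-world sensor."""
    sensor_id: str
    timestamp_ms: int
    value: float  # in the sensor's unit, e.g., degrees Celsius

# Outbound: switch on a fan at 60 % intensity.
cmd = DeviceCommand(device_id="fan-01", activate=True, intensity=0.6)
# Inbound: a temperature reading that the virtual world can react to.
reading = SensedInformation(sensor_id="temp-03", timestamp_ms=1234, value=21.5)
print(cmd, reading)
```

Standardizing both directions is what lets a virtual world both drive real actuators and be driven by real sensors without per-vendor adapters.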
Part 6 (Common types and tools): This part specifies the syntax and semantics of the data types and tools common to the tools defined in the other parts of MPEG-V.
Part 7 (Conformance and reference software): This part provides the reference software and conformance testing for MPEG-V.