Welcome to the ISO/IEC JTC 1/SC 29/WG 11 web site

also known as MPEG, the Moving Picture Experts Group.

The MPEG acronym is also used to indicate a suite of ISO/IEC digital media standards developed by this JTC 1 Working Group.


The Moving Picture Experts Group

Control Information

Standard: MPEG-V
Part number: 2
Activity status: Open

Doc#: N11995
Meeting: March 2011, Geneva, Switzerland
Source: Systems/3DG
Status: Approved
Title: Overview of
Author: Kyoungro Yoon (Konkuk University)

This standard specifies the syntax and semantics of the tools required to provide interoperability in controlling devices in real as well as virtual worlds. It covers the interfaces between the adaptation engine and the capability descriptions of actuators/sensors (SDC/SC) in the real world, and the user's sensory effect preference information (USEP), which characterizes devices and users, so that appropriate information to control individual devices (actuators and sensors) for individual users can be generated. In other words, it covers the user's sensory effect preferences, sensory device capabilities, and sensor capabilities. These tools are defined using XML Schema. The overall structure of the tools is organized and defined by the Control Information Description Language (CIDL), which enables the instantiation of three types of descriptions. The actual descriptions (actuator capabilities, sensor capabilities, and user's sensory effect preferences) are not part of CIDL, but are defined as the Device Capability Description Vocabulary (DCDV), the Sensor Capability Description Vocabulary (SCDV), and the User's Sensory Effect Preference Vocabulary (SEPV), respectively.

MPEG-V Control Information

MPEG doc#: N11995
Date: March 2011
Author: Kyoungro Yoon (Konkuk University)

MPEG-V Part 2 specifies the syntax and semantics of the tools required to provide interoperability in controlling devices in both real and virtual worlds.

The adaptation engine (Real-to-Virtual or Virtual-to-Real engine), which is not within the scope of standardization, takes five inputs: sensory effects (SE), user's sensory effect preferences (USEP), sensory device capabilities (SDC), sensor capabilities (SC), and sensed information (SI). It outputs sensory device commands and/or sensed information to control the devices in the real world or the parameters of virtual-world objects.
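As a conceptual illustration only (the standard does not define the adaptation engine, and every class and field name below is invented for this sketch), the engine's core role can be pictured as scaling an authored sensory effect by the user's preference and clamping the result to what the device can actually deliver:

```python
# Illustrative sketch, not part of MPEG-V: an adaptation engine that
# turns a sensory effect into a device command, scaled by the user's
# preference (USEP) and bounded by the device capability (SDC).
# All names here are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class SensoryEffect:        # SE: authored effect, e.g. wind at 70% intensity
    effect_type: str
    intensity: float        # normalized 0.0-1.0

@dataclass
class DeviceCapability:     # SDC: what the actuator can actually do
    max_intensity: float    # device maximum on the same normalized scale

@dataclass
class UserPreference:       # USEP: how strongly this user wants the effect
    scale: float            # 0.0 disables the effect, 1.0 keeps it as authored

def adapt(se: SensoryEffect, sdc: DeviceCapability, usep: UserPreference) -> float:
    """Return a device command intensity: preference-scaled, capability-clamped."""
    desired = se.intensity * usep.scale
    return min(desired, sdc.max_intensity)

# A 0.7-intensity wind effect sent to a fan that tops out at 0.5:
command = adapt(SensoryEffect("wind", 0.7),
                DeviceCapability(max_intensity=0.5),
                UserPreference(scale=1.0))
print(command)  # clamped to the fan's capability
```

The same function shows why the capability and preference descriptions matter: without them, the engine would have to send the authored intensity blindly, regardless of what the device or the user can accept.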

The scope of MPEG-V Part 2, illustrated in Figure 1, covers the interfaces between the adaptation engine and the capability descriptions of actuators/sensors in the real world, as well as the user's sensory preference information, which together characterize devices and users. From these descriptions, the appropriate information to control individual devices (actuators and sensors) can be generated from the sensory effects. In other words, the user's sensory preferences, sensory device capabilities, and sensor capabilities are within the scope of this part of MPEG-V. This control information, comprising the user's sensory preference information, the device capability descriptions, and the sensor capability descriptions, can be used to fine-tune the sensed information and the device commands for control of the virtual/real world by providing extra information to the adaptation engine.

Figure 1. Scope of the Control Information

MPEG-V Part 2 is composed as follows:

  • A tool called the Control Information Description Language (CIDL) is defined to provide the basic structure for describing control information;
  • A set of tools called the Device Capability Description Vocabulary (DCDV) is defined to provide an interface for describing the capabilities of each individual sensory device;
  • A set of tools called the Sensor Capability Description Vocabulary (SCDV) is defined to provide an interface for describing the capabilities of each individual sensor;
  • A set of tools called the Sensory Effect Preference Vocabulary (SEPV) is defined to provide an interface for describing an individual user's preferences for specific sensory effects.
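Since these vocabularies are defined using XML Schema, an instance of a control information description is an XML document. The sketch below uses Python's standard library to emit a purely illustrative instance; the element and attribute names are invented here for illustration and are not the normative names or namespaces defined by the CIDL/DCDV/SCDV/SEPV schemas in MPEG-V Part 2:

```python
# Illustrative only: serializing a control information description as XML.
# The element and attribute names below are assumptions, not the
# normative identifiers from the MPEG-V Part 2 schemas.
import xml.etree.ElementTree as ET

root = ET.Element("ControlInfo")

# A device capability description for a fan actuator (DCDV-style content):
ET.SubElement(root, "DeviceCapability", {
    "id": "fan-1",
    "deviceType": "FanType",     # hypothetical vocabulary term
    "maxWindSpeed": "10.0",      # hypothetical attribute, m/s
})

# A user's preference for the wind effect (SEPV-style content):
ET.SubElement(root, "UserSensoryPreference", {
    "user": "alice",
    "effect": "Wind",
    "activate": "true",
})

print(ET.tostring(root, encoding="unicode"))
```

In the standard itself, such descriptions would be validated against the normative XML Schemas and handed to the adaptation engine as the USEP, SDC, and SC inputs described above.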