Support for 360° video, also called omnidirectional video, has been standardized in the Omnidirectional Media Format (OMAF) and in Supplemental Enhancement Information (SEI) messages defined for HEVC. These standards can be used for delivering immersive visual content. However, rendering flat 360° video may cause visual discomfort when objects close to the viewer are displayed: the world around the viewer appears flattened. Interactive parallax provides the viewer's visual system with additional cues, resulting in an enhanced perception of volume around them. The interactive parallax feature of 3DoF+ will provide viewers with visual content that closely mimics natural vision, but within a limited range of viewer motion.
Compared to traditional 3DoF virtual reality (VR) experiences, where viewers are limited to rotational movements around the X, Y, and Z axes (pitch, yaw, and roll, respectively), 3DoF+ introduces additional limited translational movement along the X, Y, and Z axes. A typical 3DoF+ use case is a user sitting on a chair, looking at stereoscopic omnidirectional VR content on a head-mounted display (HMD), with the capability to slightly move their head up/down, left/right, and forward/backward.
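As a rough illustration of this viewing model, the sketch below represents a viewer pose as unrestricted rotation plus translation clamped to a small viewing space around the initial head position. The pose fields and the 10 cm bound are illustrative assumptions, not values taken from any specification.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical sketch of a 3DoF+ viewing pose: full rotation (yaw, pitch, roll)
// plus translation restricted to a small viewing space around the initial position.
struct ViewerPose {
    double yaw, pitch, roll;   // rotation in degrees (unrestricted, as in 3DoF)
    double tx, ty, tz;         // translation in metres (limited in 3DoF+)
};

constexpr double kMaxOffsetMetres = 0.10;  // assumed viewing-space half-width

ViewerPose clampToViewingSpace(ViewerPose p) {
    auto clamp = [](double v) {
        return std::max(-kMaxOffsetMetres, std::min(kMaxOffsetMetres, v));
    };
    p.tx = clamp(p.tx);
    p.ty = clamp(p.ty);
    p.tz = clamp(p.tz);
    return p;
}

int main() {
    // Head moves 4 cm left and 20 cm forward; the forward motion is clamped.
    ViewerPose tracked{15.0, -5.0, 0.0, -0.04, 0.0, 0.20};
    ViewerPose rendered = clampToViewingSpace(tracked);
    std::printf("rendered translation: (%.2f, %.2f, %.2f) m\n",
                rendered.tx, rendered.ty, rendered.tz);
    return 0;
}
```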
As described in the MPEG-I Architectures Technical Report, the head motion parallax feature can be achieved by using colour videos, depth information, and associated metadata. As such, it is expected that the MPEG-I solution for the coding of 3DoF+ video will be built on the existing HEVC standard for the video and depth information, while the 3DoF+ metadata will be standardized in MPEG-I Part 7. This metadata will be referenced at the systems level in OMAF and at the video level in SEI messages for HEVC.
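To make this component split concrete, the following sketch groups the elements named above, an HEVC-coded texture stream, an HEVC-coded depth stream, and the 3DoF+ metadata, into one illustrative structure. All type names and fields are hypothetical and do not reproduce the MPEG-I Part 7 syntax.

```cpp
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

// Illustrative grouping of the 3DoF+ components, assuming one HEVC bitstream
// each for texture and depth; field names are placeholders, not standard syntax.
struct HevcBitstream {
    std::vector<uint8_t> nalUnits;   // coded HEVC access units
};

struct ThreeDofPlusMetadata {
    // Simplified placeholder for the view parameters and depth quantisation
    // ranges a renderer needs to synthesise parallax-correct viewports.
    std::vector<std::string> viewParameters;
    double depthNear = 0.3;          // metres, assumed example values
    double depthFar  = 10.0;
};

struct ThreeDofPlusContent {
    HevcBitstream texture;           // colour video, HEVC-coded
    HevcBitstream depth;             // depth video, HEVC-coded
    ThreeDofPlusMetadata metadata;   // standardized in MPEG-I Part 7,
                                     // referenced via OMAF and HEVC SEI
};

int main() {
    ThreeDofPlusContent content;
    content.metadata.depthNear = 0.5;   // example renderer-facing parameters
    content.metadata.depthFar  = 25.0;
    std::printf("depth range: [%.1f, %.1f] m\n",
                content.metadata.depthNear, content.metadata.depthFar);
    return 0;
}
```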