The Moving Picture Experts Group

Workshop on standard coding technologies for immersive audio and visual experiences

Date: Wednesday, 10 July 2019, 1:00 pm to 6:00 pm

Venue: Clarion Post Hotel, Drottningtorget 10, 411 03 Gothenburg, Sweden

MPEG aims to standardize coding solutions for the digital representation of light fields, which capture the color and directional light information at every point in space. The objective is to support immersive applications such as virtual and augmented reality with the highest level of visual comfort, including the parallax cues required in natural vision.
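For context, a light field in this sense is commonly formalized as the plenoptic function, which gives the radiance observed at a point in space, in a given direction, at a given wavelength and time; the formulation below is the standard textbook sketch rather than text from any MPEG specification:

    L(x, y, z, \theta, \phi, \lambda, t)

Practical capture devices subsample this seven-dimensional function, for example down to a 4D two-plane parameterization L(u, v, s, t), which is one reason the capture and display technologies discussed below span such a wide design space.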

Because any digital capture subsamples the light field in which we are immersed, the variety of capture technologies (omnidirectional and plenoptic cameras, camera arrays, etc.) and display technologies (head-mounted devices, light field displays, integral photography displays, etc.) leads to a wide range of coding technologies to be explored, some of which will be selected for further standardization.

MPEG has already standardized 360° video, also called omnidirectional video, in OMAF version 1, and is standardizing 3DoF+ extensions in OMAF version 2, expected by mid-2020, which bring natural parallax cues within a limited range of viewer motion. Full parallax for dynamic objects is supported in the first version of Point Cloud Compression, expected by early 2020, while coding of 6DoF virtual reality over large navigation volumes is under study for standardization by 2023.

This workshop will cover MPEG-I immersive activities, past, present, and future, and invites participants to present demos and future requirements to the MPEG community.

Programme

Title: Standard coding technologies for immersive audio-visual experiences

Date: Wednesday, 10 July 2019

Venue: Clarion Post Hotel (MPEG meeting venue), Drottningtorget 10, 411 03 Gothenburg, Sweden

1300-1315

Introduction

(Lu Yu, Zhejiang University)

1315-1345

Use cases and challenges for immersive user experiences

(Valerie Allie, InterDigital)

1345-1415

Overview of technologies for immersive visual experiences: capture, processing, compression, standardization and display

(Marek Domanski, Poznan University of Technology)

1415-1445

MPEG-I Immersive Audio

(Schuyler Quackenbush, Audio Research Labs)

1445-1455

Brief introduction about demos:

  • Integral photography display (NHK)
  • Real-time interactive demo with 3DoF+ content (InterDigital)
  • Plenoptic 2.0 video camera (Tsinghua University)
  • A simple free-viewpoint television system (Poznan University of Technology)

1455-1530

Demos

Coffee break

1530-1600

360° and 3DoF+ video

(Bart Kroon, Philips)

1600-1630

Point cloud compression

(Marius Preda, Telecom SudParis, CNRS Samovar)

1630-1700

How can we achieve 6DoF video compression?

(Joel Jung, Orange)

1700-1730

How can we achieve lenslet video compression?

(Xin Jin, Tsinghua University; Mehrdad Teratani, Nagoya University)

1730-1800

Discussion