FAQ https://mpeg.chiariglione.org/faq/mp7.htm?title=&title_1= en Which layers are passed, before MPEG-4 objects are composed? https://mpeg.chiariglione.org/faq/which-layers-are-passed-mpeg-4-objects-are-composed <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">There are three layers in an audio-visual terminal that should be mentioned: delivery layer, sync layer and compression layer.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/which-layers-are-passed-mpeg-4-objects-are-composed" rel="tag" title="Which layers are passed, before MPEG-4 objects are composed?">Read more<span class="element-invisible"> about Which layers are passed, before MPEG-4 objects are composed?</span></a></li></ul> Sun, 03 Feb 2013 17:03:14 +0000 leonardo 305 at https://mpeg.chiariglione.org What is the MPEG-4 TransMux? https://mpeg.chiariglione.org/faq/what-mpeg-4-transmux <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The TransMux is an outdated synonym for a part of what we now call delivery layer. 
This change in terminology was made to align the Systems and DMIF parts of the standard.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-mpeg-4-transmux" rel="tag" title="What is the MPEG-4 TransMux?">Read more<span class="element-invisible"> about What is the MPEG-4 TransMux?</span></a></li></ul> Sun, 03 Feb 2013 17:04:13 +0000 leonardo 306 at https://mpeg.chiariglione.org What is the MPEG-4 delivery layer? https://mpeg.chiariglione.org/faq/what-mpeg-4-delivery-layer <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>It is a generic abstraction for delivery mechanisms (computer networks, etc.) able to store or transmit a number of multiplexed elementary streams or FlexMux streams. To allow for maximum flexibility in service creation and application design, it is not specified by MPEG-4. The interface to the delivery layer, however, is well defined, thus allowing transmission of MPEG-4 content over any type of transport layer facility (e.g., ITU-T Recommendations H.22x, MPEG-2 Transport Stream, IETF RTP).</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-mpeg-4-delivery-layer" rel="tag" title="What is the MPEG-4 delivery layer?">Read more<span class="element-invisible"> about What is the MPEG-4 delivery layer?</span></a></li></ul> Sun, 03 Feb 2013 17:05:09 +0000 leonardo 307 at https://mpeg.chiariglione.org What is the MPEG-4 FlexMux? https://mpeg.chiariglione.org/faq/what-mpeg-4-flexmux <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The FlexMux is a tool that provides a flexible way of interleaving packets of data. It is not meant to be robust to errors, because it can be layered on top of a robust transport layer. 
The FlexMux is fully defined by MPEG-4, but its use is optional: applications can operate directly on top of a traditional transport layer (formerly called "TransMux") if they so desire.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-mpeg-4-flexmux" rel="tag" title="What is the MPEG-4 FlexMux?">Read more<span class="element-invisible"> about What is the MPEG-4 FlexMux?</span></a></li></ul> Sun, 03 Feb 2013 17:06:16 +0000 leonardo 308 at https://mpeg.chiariglione.org What are the MPEG "object descriptors"? https://mpeg.chiariglione.org/faq/what-are-mpeg-object-descriptors <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The ESM part of Systems also specifies means to identify and name elementary streams so that they can be referred to in a scene description and be attached to individual objects. This association is performed in object descriptors that are transmitted in their own elementary streams. Object descriptors are separate from the scene description itself, thus simplifying editing and remultiplexing of MPEG-4 content. The descriptors associate audio-visual objects (more precisely, nodes in the scene) with elementary stream identifiers.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-mpeg-object-descriptors" rel="tag" title="What are the MPEG &quot;object descriptors&quot;?">Read more<span class="element-invisible"> about What are the MPEG &quot;object descriptors&quot;?</span></a></li></ul> Sun, 03 Feb 2013 17:07:53 +0000 leonardo 309 at https://mpeg.chiariglione.org What's the MPEG-4 Initial Object Descriptor (IOD)? 
https://mpeg.chiariglione.org/faq/whats-mpeg-4-initial-object-descriptor-iod <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The IOD is an object descriptor that not only describes a set of elementary streams but also conveys the profile and level information that a receiver needs to assess the processing resources required for that content.<br /> It is called an initial OD because at least the very first OD used to access the MPEG-4 content needs to be such an IOD. However, if sub-scenes of a presentation are generated separately, they may also start with an IOD (attached to a BIFS Inline node).</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/whats-mpeg-4-initial-object-descriptor-iod" rel="tag" title="What&#039;s the MPEG-4 Initial Object Descriptor (IOD)?">Read more<span class="element-invisible"> about What&#039;s the MPEG-4 Initial Object Descriptor (IOD)?</span></a></li></ul> Sun, 03 Feb 2013 17:08:58 +0000 leonardo 310 at https://mpeg.chiariglione.org What is "Intellectual Property" in the context of MPEG-4? https://mpeg.chiariglione.org/faq/what-intellectual-property-context-mpeg-4 <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Intellectual property describes the efforts of creators to produce works, including ideas, information, inventions and expressions that may be manifested in digital content. Such creative material has a measurable value, and this value is generally expressed as "rights". 
An Intellectual Property Right (IPR) is the legal authority, afforded by copyright protection and defined by national and, increasingly, international legislation, granted to creators and their appointed agents.<br /></p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-intellectual-property-context-mpeg-4" rel="tag" title="What is &quot;Intellectual Property&quot; in the context of MPEG-4?">Read more<span class="element-invisible"> about What is &quot;Intellectual Property&quot; in the context of MPEG-4?</span></a></li></ul> Sun, 03 Feb 2013 17:26:41 +0000 leonardo 326 at https://mpeg.chiariglione.org What does ‘Management of the MPEG-4 Receiving Terminal's Buffer’ mean? https://mpeg.chiariglione.org/faq/what-does-%E2%80%98management-mpeg-4-receiving-terminals-buffer%E2%80%99-mean <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>To predict how the decoder will behave when decoding the various elementary data streams that form an MPEG-4 session, Systems provides a Systems Decoder Model. This model provides a well-defined framework in which the receiver's behaviour can be unambiguously characterized. Use of this model enables the encoder to monitor the buffer resources that are used to decode the session and ensure that they are not exceeded. 
The required buffer resources are conveyed to the decoder at the beginning of a session, so that the decoder can decide whether it is capable of providing them.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-does-%E2%80%98management-mpeg-4-receiving-terminals-buffer%E2%80%99-mean" rel="tag" title="What does ‘Management of the MPEG-4 Receiving Terminal&#039;s Buffer’ mean?">Read more<span class="element-invisible"> about What does ‘Management of the MPEG-4 Receiving Terminal&#039;s Buffer’ mean?</span></a></li></ul> Sun, 03 Feb 2013 17:10:10 +0000 leonardo 311 at https://mpeg.chiariglione.org What is meant by IP "Management & Protection"? https://mpeg.chiariglione.org/faq/what-meant-ip-management-protection <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>With the advancement of electronic commerce and the ability to replicate perfect copies of digital material over global networks, IPR owners must establish mechanisms to control how their copyright creations are exploited. This is achieved in two ways.</p> <p>Firstly, by the identification of the individual IPR components of a digital creation and the management of these identifiers by persistently associating them with the digital object they identify. Please refer to Document WG11/N1918 for further information on the identification of intellectual property.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-meant-ip-management-protection" rel="tag" title="What is meant by IP &quot;Management &amp; Protection&quot;?">Read more<span class="element-invisible"> about What is meant by IP &quot;Management &amp; Protection&quot;?</span></a></li></ul> Sun, 03 Feb 2013 17:28:32 +0000 leonardo 327 at https://mpeg.chiariglione.org What are the assumptions of the MPEG-4 timing model? 
https://mpeg.chiariglione.org/faq/what-are-assumptions-mpeg-4-timing-model <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>For applications involving real-time transmission, the timing model adopted by MPEG-4 assumes a constant end-to-end delay from the output of the encoder to the input of the decoder. This is only done so that a well defined and verifiably correct model is employed. It does not mean that MPEG-4 content cannot be transmitted over variable delay networks.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-assumptions-mpeg-4-timing-model" rel="tag" title="What are the assumptions of the MPEG-4 timing model?">Read more<span class="element-invisible"> about What are the assumptions of the MPEG-4 timing model?</span></a></li></ul> Sun, 03 Feb 2013 17:11:03 +0000 leonardo 312 at https://mpeg.chiariglione.org What should I use timing information for?MPEG-4 https://mpeg.chiariglione.org/faq/what-should-i-use-timing-information-formpeg-4 <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>There are two kinds of timing information that can be conveyed in elementary streams. The first set is used to convey the sender's time base to the receiver (clock references) and the second contains the desired time (in units of the sender's time base) for specific events such as the desired decoding or composition time for portions of the encoded audio-visual information. 
With this timing information, the inter-picture interval and audio sample rate can be adjusted at the decoder to match the encoder's inter-picture interval and audio sample rate for synchronized operation.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-should-i-use-timing-information-formpeg-4" rel="tag" title="What should I use timing information for?MPEG-4 ">Read more<span class="element-invisible"> about What should I use timing information for?MPEG-4 </span></a></li></ul> Sun, 03 Feb 2013 17:12:12 +0000 leonardo 313 at https://mpeg.chiariglione.org What are MPEG-4 Systems 'profiles and levels'? https://mpeg.chiariglione.org/faq/what-are-mpeg-4-systems-profiles-and-levels <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Profiles are a mechanism to establish well-defined subsets of the overall MPEG-4 functionality. Profiles are defined in multiple dimensions. In the context of MPEG-4 Systems, Scene Graph Profiles and OD Profiles are distinguished.<br /> While profiles establish a subset of MPEG-4 tools, levels put constraints on the parameters of these tools. Therefore, the combination of profiles and levels helps to determine the necessary processing power and memory requirements for an application.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-mpeg-4-systems-profiles-and-levels" rel="tag" title="What are MPEG-4 Systems &#039;profiles and levels&#039;?">Read more<span class="element-invisible"> about What are MPEG-4 Systems &#039;profiles and levels&#039;?</span></a></li></ul> Sun, 03 Feb 2013 17:13:19 +0000 leonardo 314 at https://mpeg.chiariglione.org Which MPEG-4 Systems profiles are distinguished? 
https://mpeg.chiariglione.org/faq/which-mpeg-4-systems-profiles-are-distinguished <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">In the context of MPEG-4 Systems, Scene Graph Profiles and OD Profiles are distinguished. Several scene graph profiles are currently defined: Audio, Simple2D, Complete2D and Complete.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/which-mpeg-4-systems-profiles-are-distinguished" rel="tag" title="Which MPEG-4 Systems profiles are distinguished?">Read more<span class="element-invisible"> about Which MPEG-4 Systems profiles are distinguished?</span></a></li></ul> Sun, 03 Feb 2013 17:14:17 +0000 leonardo 315 at https://mpeg.chiariglione.org What is the MPEG-4 model of an audio-visual object? https://mpeg.chiariglione.org/faq/what-mpeg-4-model-audio-visual-object <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>In the MPEG-4 model, audio-visual objects have both a spatial and a temporal extent. Temporally, all AV objects have a single dimension. Each AV object has a local coordinate system in which the object has a fixed spatio-temporal location and scale. AV objects are positioned in a scene by specifying one or more coordinate transformations from the object's local coordinate system into a common, global coordinate system, or scene coordinate system. 
An audio-visual object in a BIFS scene is usually represented by one BIFS node or a sub-tree of the BIFS scene graph.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-mpeg-4-model-audio-visual-object" rel="tag" title="What is the MPEG-4 model of an audio-visual object?">Read more<span class="element-invisible"> about What is the MPEG-4 model of an audio-visual object?</span></a></li></ul> Sun, 03 Feb 2013 17:18:19 +0000 leonardo 318 at https://mpeg.chiariglione.org Why is scene description information separate from audio-visual objects https://mpeg.chiariglione.org/faq/why-scene-description-information-separate-audio-visual-objects <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Scene description information is a property of the scene's structure rather than of particular AV objects. Consequently, it is transmitted as a separate stream. This is an important feature for bitstream editing and one of the essential content-based functionalities in MPEG-4. For bitstream editing, one can change the composition of AV objects without having to decode their bitstreams and change their content. If the position of the object were part of the object's bitstream, this would become very difficult.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/why-scene-description-information-separate-audio-visual-objects" rel="tag" title="Why is scene description information separate from audio-visual objects">Read more<span class="element-invisible"> about Why is scene description information separate from audio-visual objects</span></a></li></ul> Sun, 03 Feb 2013 17:19:15 +0000 leonardo 319 at https://mpeg.chiariglione.org What does MPEG-4 demultiplexing mean? 
https://mpeg.chiariglione.org/faq/what-does-mpeg-4-demultiplexing-mean <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The demultiplexing stage retrieves the individual elementary streams that are usually interleaved for transmission or storage. It is a functionality that is not seen as part of MPEG-4 Systems; it is hidden in the delivery layer. MPEG-4 ESM just deals with the demultiplexed, usually still SL-packetized, elementary streams that are accessible through the DMIF application interface (DAI). In order to support upstream information, a receiving terminal might also incorporate multiplexing facilities.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-does-mpeg-4-demultiplexing-mean" rel="tag" title="What does MPEG-4 demultiplexing mean?">Read more<span class="element-invisible"> about What does MPEG-4 demultiplexing mean?</span></a></li></ul> Sun, 03 Feb 2013 17:01:53 +0000 leonardo 304 at https://mpeg.chiariglione.org What are the formats that are supported by MPEG-4 Visual? 
https://mpeg.chiariglione.org/faq/what-are-formats-are-supported-mpeg-4-visual <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: small; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">MPEG-4 Video supports :</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-formats-are-supported-mpeg-4-visual" rel="tag" title="What are the formats that are supported by MPEG-4 Visual?">Read more<span class="element-invisible"> about What are the formats that are supported by MPEG-4 Visual?</span></a></li></ul> Sun, 03 Feb 2013 17:34:01 +0000 leonardo 329 at https://mpeg.chiariglione.org What are the bitrates that are supported by MPEG-4 Visual? 
https://mpeg.chiariglione.org/faq/what-are-bitrates-are-supported-mpeg-4-visual <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: small; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">MPEG-4 Video is optimized for :</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-bitrates-are-supported-mpeg-4-visual" rel="tag" title="What are the bitrates that are supported by MPEG-4 Visual?">Read more<span class="element-invisible"> about What are the bitrates that are supported by MPEG-4 Visual?</span></a></li></ul> Sun, 03 Feb 2013 17:35:36 +0000 leonardo 330 at https://mpeg.chiariglione.org Where do the needs for MPEG-4 Visual come from? What are the targeted applications? 
https://mpeg.chiariglione.org/faq/where-do-needs-mpeg-4-visual-come-what-are-targeted-applications <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: small; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">At the beginning of the work on MPEG-4, the objective of the new standard was to address very low bitrate coding issues.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/where-do-needs-mpeg-4-visual-come-what-are-targeted-applications" rel="tag" title="Where do the needs for MPEG-4 Visual come from? What are the targeted applications?">Read more<span class="element-invisible"> about Where do the needs for MPEG-4 Visual come from? What are the targeted applications?</span></a></li></ul> Sun, 03 Feb 2013 17:37:03 +0000 leonardo 331 at https://mpeg.chiariglione.org What are exactly the functionalities that are supported by MPEG-4 Video? 
https://mpeg.chiariglione.org/faq/what-are-exactly-functionalities-are-supported-mpeg-4-video <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: small; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">MPEG-4 supports eight key functionalities, which can be grouped into three classes:</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-exactly-functionalities-are-supported-mpeg-4-video" rel="tag" title="What are exactly the functionalities that are supported by MPEG-4 Video?">Read more<span class="element-invisible"> about What are exactly the functionalities that are supported by MPEG-4 Video?</span></a></li></ul> Sun, 03 Feb 2013 17:38:02 +0000 leonardo 332 at https://mpeg.chiariglione.org What are the different profiles supported by MPEG-4 Video? 
https://mpeg.chiariglione.org/faq/what-are-different-profiles-supported-mpeg-4-video <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: small; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">The visual profiles determine which visual object types can be present in the scene. This is also the way they are defined: as a list of admissible object types.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-are-different-profiles-supported-mpeg-4-video" rel="tag" title="What are the different profiles supported by MPEG-4 Video?">Read more<span class="element-invisible"> about What are the different profiles supported by MPEG-4 Video?</span></a></li></ul> Sun, 03 Feb 2013 17:39:08 +0000 leonardo 333 at https://mpeg.chiariglione.org What is the difference between BIFS and VRML? 
https://mpeg.chiariglione.org/faq/what-difference-between-bifs-and-vrml <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">BIFS has been designed as an extension to the VRML 2.0 specification. In Version 2 of MPEG-4 Systems, all VRML nodes are supported.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-difference-between-bifs-and-vrml" rel="tag" title="What is the difference between BIFS and VRML?">Read more<span class="element-invisible"> about What is the difference between BIFS and VRML?</span></a></li></ul> Sun, 03 Feb 2013 17:20:50 +0000 leonardo 321 at https://mpeg.chiariglione.org The Scene Description looks similar to VRML. Is it? https://mpeg.chiariglione.org/faq/scene-description-looks-similar-vrml-it <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The scene description has several similarities to VRML, as the set of nodes defined by VRML was used as an initial set of composition nodes for MPEG-4. The environment that MPEG-4 addresses, however, is quite different from VRML because a key requirement is support for high quality real-time audio-visual content. 
In addition, rather than using a static scene description, MPEG-4 defines a dynamic one in which objects can be added, changed, or removed from the scene description at any point in time.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/scene-description-looks-similar-vrml-it" rel="tag" title="The Scene Description looks similar to VRML. Is it?">Read more<span class="element-invisible"> about The Scene Description looks similar to VRML. Is it?</span></a></li></ul> Sun, 03 Feb 2013 17:21:44 +0000 leonardo 322 at https://mpeg.chiariglione.org How is interactivity handled in MPEG-4? https://mpeg.chiariglione.org/faq/how-interactivity-handled-mpeg-4 <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Interactivity in MPEG-4 Systems is separated into two major categories: client side and server side. The former is available locally at an MPEG-4 terminal while the latter requires communication between the terminal and the sender. Client side interactivity can be further divided in simple object manipulation (repositioning, hiding, changing attributes, etc.) that does not require normative support from the standard, and more general types of events (hyper linking, triggers, etc.) that do require normative support. Note that server side interactivity also requires normative support.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/how-interactivity-handled-mpeg-4" rel="tag" title="How is interactivity handled in MPEG-4?">Read more<span class="element-invisible"> about How is interactivity handled in MPEG-4?</span></a></li></ul> Sun, 03 Feb 2013 17:22:31 +0000 leonardo 323 at https://mpeg.chiariglione.org Is the MPEG-4 scene description streamed? 
https://mpeg.chiariglione.org/faq/mpeg-4-scene-description-streamed <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Yes. There are elementary streams (just as any visual or audio stream) with BIFS commands and elementary streams conveying BIFS animation data. This makes it possible to attach time stamps to such information, just as time stamps are attached, e.g., to an audio frame.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/mpeg-4-scene-description-streamed" rel="tag" title="Is the MPEG-4 scene description streamed?">Read more<span class="element-invisible"> about Is the MPEG-4 scene description streamed?</span></a></li></ul> Sun, 03 Feb 2013 17:23:40 +0000 leonardo 324 at https://mpeg.chiariglione.org Can there be multiple scene description streams in an MPEG-4 presentation? https://mpeg.chiariglione.org/faq/can-there-be-multiple-scene-description-streams-mpeg-4-presentation <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Yes. For example, the BIFS scene may be composed from multiple sub-scenes that are Inlined to a main scene. In that case, each sub-scene would have its own scene description stream.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/can-there-be-multiple-scene-description-streams-mpeg-4-presentation" rel="tag" title="Can there be multiple scene description streams in an MPEG-4 presentation?">Read more<span class="element-invisible"> about Can there be multiple scene description streams in an MPEG-4 presentation?</span></a></li></ul> Sun, 03 Feb 2013 17:24:34 +0000 leonardo 325 at https://mpeg.chiariglione.org What is BIFS? 
https://mpeg.chiariglione.org/faq/what-bifs <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p style="color: rgb(0, 0, 0); font-family: 'Times New Roman'; font-size: medium; font-style: normal; font-variant: normal; font-weight: normal; letter-spacing: normal; line-height: normal; orphans: 2; text-align: start; text-indent: 0px; text-transform: none; white-space: normal; widows: 2; word-spacing: 0px; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; background-color: rgb(255, 255, 255);">BIFS is an abbreviation for "BInary Format for Scenes". BIFS provides a complete framework for the presentation engine of MPEG-4 terminals.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-bifs" rel="tag" title="What is BIFS?">Read more<span class="element-invisible"> about What is BIFS?</span></a></li></ul> Sun, 03 Feb 2013 17:15:43 +0000 leonardo 316 at https://mpeg.chiariglione.org Why BIFS? 
https://mpeg.chiariglione.org/faq/why-bifs <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>A central concept in the MPEG-4 design is the transmission of, and interaction with, audio-visual objects of a synthetic or natural nature.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/why-bifs" rel="tag" title="Why BIFS?">Read more<span class="element-invisible"> about Why BIFS?</span></a></li></ul> Sun, 03 Feb 2013 17:17:28 +0000 leonardo 317 at https://mpeg.chiariglione.org Can the scene description be changed https://mpeg.chiariglione.org/faq/can-scene-description-be-changed <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The scene description can be dynamically changed at any time. An initial scene description is provided at the beginning of an MPEG-4 stream. It can be as simple as a single node, or as complex as one wants (within limits that are established for ensuring conformance). BIFS-Commands are used to modify a set of properties of the scene at a given time. 
It is possible to insert, delete and replace nodes, fields and ROUTEs, as well as to replace the entire scene.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/can-scene-description-be-changed" rel="tag" title="Can the scene description be changed">Read more<span class="element-invisible"> about Can the scene description be changed</span></a></li></ul> Sun, 03 Feb 2013 17:20:00 +0000 leonardo 320 at https://mpeg.chiariglione.org What is Part 16, Animation Framework eXtension (AFX)? https://mpeg.chiariglione.org/faq/what-part-16-animation-framework-extension-afx <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The MPEG‑4 Animation Framework eXtension (AFX), specified in ISO/IEC 14496‑16, contains a set of 3D tools for interactive 3D content operating at the geometry, modeling and biomechanical levels, and encompassing existing tools previously defined in the MPEG‑4 specification.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-part-16-animation-framework-extension-afx" rel="tag" title="What is Part 16, Animation Framework eXtension (AFX)?">Read more<span class="element-invisible"> about What is Part 16, Animation Framework eXtension (AFX)?</span></a></li></ul> Sun, 03 Feb 2013 17:48:36 +0000 leonardo 336 at https://mpeg.chiariglione.org How do I declare a Digital Item? https://mpeg.chiariglione.org/about/faq/how-do-i-declare-a-digital-item <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Digital Items are declared using the Digital Item Declaration Language (DIDL). 
Declaring a Digital Item involves specifying its resources, metadata and their interrelationships.</p> <p>The DIDL is a language based on an XML schema that has been developed to allow the declaration of Digital Items within MPEG-21. It is a flexible and general schema that provides the hooks for higher-level functionality.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/about/faq/how-do-i-declare-a-digital-item" rel="tag" title="How do I declare a Digital Item?">Read more<span class="element-invisible"> about How do I declare a Digital Item?</span></a></li></ul> Sun, 03 Feb 2013 18:09:41 +0000 leonardo 344 at https://mpeg.chiariglione.org What is a Digital Item? https://mpeg.chiariglione.org/faq/what-a-digital-item <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Digital Items are structured digital objects, including a standard representation and identification, and metadata. They are the basic unit of transaction in the MPEG-21 framework.</p> <p>Basically, a Digital Item is a combination of resources (such as videos, audio tracks, images, etc.), metadata (such as MPEG-7 descriptors), and structure (describing the relationship between resources).</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/faq/what-a-digital-item" rel="tag" title="What is a Digital Item?">Read more<span class="element-invisible"> about What is a Digital Item?</span></a></li></ul> Sun, 03 Feb 2013 18:03:54 +0000 leonardo 340 at https://mpeg.chiariglione.org What does it mean to ‘declare’ a Digital Item? 
https://mpeg.chiariglione.org/about/faq/what-does-it-mean-%E2%80%98declare%E2%80%99-a-digital-item <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>‘Declaring’ a Digital Item is the process of defining the structure of the Digital Item. It enables creators of Digital Items to declare what their Items comprise and how these components relate to each other.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/about/faq/what-does-it-mean-%E2%80%98declare%E2%80%99-a-digital-item" rel="tag" title="What does it mean to ‘declare’ a Digital Item?">Read more<span class="element-invisible"> about What does it mean to ‘declare’ a Digital Item?</span></a></li></ul> Sun, 03 Feb 2013 18:07:13 +0000 leonardo 342 at https://mpeg.chiariglione.org Can you give an example of a Digital Item? https://mpeg.chiariglione.org/about/faq/can-you-give-example-a-digital-item <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>One very simple example of a Digital Item would be a ‘digital music release’: it could comprise two sound recordings (e.g. sound files coded in MPEG-2 AAC), the cover photograph (using JPEG), a video clip, and a text file containing the lyrics for the songs. The Digital Item could also contain metadata describing the content of the Item.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/about/faq/can-you-give-example-a-digital-item" rel="tag" title="Can you give an example of a Digital Item?">Read more<span class="element-invisible"> about Can you give an example of a Digital Item?</span></a></li></ul> Sun, 03 Feb 2013 18:08:32 +0000 leonardo 343 at https://mpeg.chiariglione.org I heard that a Second Edition of MPEG-2 Audio has been approved. 
What are the reasons behind this revision? https://mpeg.chiariglione.org/content/i-heard-a-second-edition-mpeg-2-audio-has-been-approved-what-are-reasons-behind-revision <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>While implementing the MPEG-2 Audio standard, as published in 1995, it was discovered that a certain combination of functionalities could not work properly. Although this combination was not considered to be of great practical importance, it was felt necessary to correct the standard in this respect. Since this necessitated a revision of the document, the opportunity was then taken to improve the standard in some other fields as well.</p> <p>The technical changes in the Second Edition compared to the first publication of ISO/IEC 13818-3 (1995) are:</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/i-heard-a-second-edition-mpeg-2-audio-has-been-approved-what-are-reasons-behind-revision" rel="tag" title="I heard that a Second Edition of MPEG-2 Audio has been approved. What are the reasons behind this revision?">Read more<span class="element-invisible"> about I heard that a Second Edition of MPEG-2 Audio has been approved. What are the reasons behind this revision?</span></a></li></ul> Sun, 03 Feb 2013 16:07:55 +0000 leonardo 296 at https://mpeg.chiariglione.org What are the impacts of the technical changes in the revision to the Technical Report and Conformance documents? https://mpeg.chiariglione.org/content/what-are-impacts-technical-changes-revision-technical-report-and-conformance-documents <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>There are no impacts on the Conformance document. 
There is only a minor impact on the Technical Report: one possible embodiment of a lowpass filter was implemented in the Technical Report. This filter has to be removed and the dematrix operations adapted. An amendment to the Technical Report is now being prepared.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/what-are-impacts-technical-changes-revision-technical-report-and-conformance-documents" rel="tag" title="What are the impacts of the technical changes in the revision to the Technical Report and Conformance documents?">Read more<span class="element-invisible"> about What are the impacts of the technical changes in the revision to the Technical Report and Conformance documents?</span></a></li></ul> Sun, 03 Feb 2013 16:43:52 +0000 leonardo 297 at https://mpeg.chiariglione.org When should I use AAC rather than MPEG-2 BC? https://mpeg.chiariglione.org/content/when-should-i-use-aac-rather-mpeg-2-bc <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Both provide 5-channel audio coding capability; however, AAC provides a factor of two better audio compression relative to MPEG-2 BC, and is appropriate in all situations in which backward compatibility is not required or can be accomplished with simulcast. An MPEG-1 two-channel decoder can decode an MPEG-2 BC 5-channel bitstream. 
AAC has no such backward compatibility requirement and, for 5-channel audio signals, has been shown in MPEG formal listening tests to provide slightly better audio quality at 320 kb/s than MPEG-2 BC can provide at 640 kb/s.</p> </div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/when-should-i-use-aac-rather-mpeg-2-bc" rel="tag" title="When should I use AAC rather than MPEG-2 BC?">Read more<span class="element-invisible"> about When should I use AAC rather than MPEG-2 BC?</span></a></li></ul> Sun, 03 Feb 2013 16:49:43 +0000 leonardo 300 at https://mpeg.chiariglione.org What kind of support does MPEG provide for implementers of MPEG Audio? https://mpeg.chiariglione.org/content/what-kind-support-does-mpeg-provide-implementers-mpeg-audio <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>MPEG provides different kinds of support to implementers. Firstly, a Technical Report is issued that contains software that describes the decoder and an example encoder. This software can be used by implementers to analyze the algorithms and become accustomed to them, and could be used as a basis for an implementation. This encoder can be used to generate test sequences. The Technical Report is published as part 5 of the standard, i.e. 
ISO/IEC 11172-5 for MPEG-1 and ISO/IEC 13818-5 for MPEG-2, which also covers MPEG-2 AAC.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/what-kind-support-does-mpeg-provide-implementers-mpeg-audio" rel="tag" title="What kind of support does MPEG provide for implementers of MPEG Audio?">Read more<span class="element-invisible"> about What kind of support does MPEG provide for implementers of MPEG Audio?</span></a></li></ul> Sun, 03 Feb 2013 16:56:59 +0000 leonardo 303 at https://mpeg.chiariglione.org Is there any reference software for MPEG-2 AAC? https://mpeg.chiariglione.org/content/there-any-reference-software-mpeg-2-aac <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Yes. There is reference software for both an AAC example encoder and reference decoder. The decoder source is complete and fully compliant and is capable of decoding all three AAC profiles: Main, Low Complexity and Scalable Sampling Rate. It is a general multi-channel decoder capable of decoding up to 48 audio channels, 15 auxiliary low frequency enhancement channels and 15 data streams. Furthermore, it is quite efficient in that the compiled reference source code decodes a stereo bitstream in real time on a 100 MHz Pentium.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/there-any-reference-software-mpeg-2-aac" rel="tag" title="Is there any reference software for MPEG-2 AAC?">Read more<span class="element-invisible"> about Is there any reference software for MPEG-2 AAC?</span></a></li></ul> Sun, 03 Feb 2013 16:53:14 +0000 leonardo 302 at https://mpeg.chiariglione.org What is MPEG-2 AAC? 
https://mpeg.chiariglione.org/content/what-mpeg-2-aac <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>The MPEG-2 AAC standard is a new, state-of-the-art audio standard that provides very high audio quality at a rate of 64 kb/s/channel for multichannel operation. It provides a capability of up to 48 main audio channels, 16 low frequency effects channels, 16 overdub/multilingual channels, and 16 data streams. Up to 16 programs can be described, each consisting of any number of the audio and data elements. There are three profiles for the AAC standard, called Main Profile, Low Complexity Profile, and Scalable Sampling Rate Profile.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/what-mpeg-2-aac" rel="tag" title="What is MPEG-2 AAC?">Read more<span class="element-invisible"> about What is MPEG-2 AAC?</span></a></li></ul> Sun, 03 Feb 2013 16:45:38 +0000 leonardo 298 at https://mpeg.chiariglione.org Why should I use MPEG-2 AAC rather than Dolby AC-3? https://mpeg.chiariglione.org/content/why-should-i-use-mpeg-2-aac-rather-dolby-ac-3 <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>AAC is a state-of-the-art audio compression algorithm that provides compression superior to that provided by older algorithms such as AC-3. AAC and AC-3 are both transform coders, but AAC uses a filterbank with a finer frequency resolution that enables superior signal compression. 
AAC also uses a number of new tools such as temporal noise shaping, backward adaptive linear prediction, joint stereo coding techniques and Huffman coding of quantized components, each of which provides additional audio compression capability.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/why-should-i-use-mpeg-2-aac-rather-dolby-ac-3" rel="tag" title="Why should I use MPEG-2 AAC rather than Dolby AC-3?">Read more<span class="element-invisible"> about Why should I use MPEG-2 AAC rather than Dolby AC-3?</span></a></li></ul> Sun, 03 Feb 2013 16:47:08 +0000 leonardo 299 at https://mpeg.chiariglione.org Is stream splicing or "break-in" supported in MPEG-2 AAC? 
https://mpeg.chiariglione.org/content/stream-splicing-or-break-supported-mpeg-2-aac <div class="field field-name-body field-type-text-with-summary field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>In MPEG-2 AAC Low Complexity and MPEG-2 AAC SSR modes, the prediction tools are not used, so break-in support is the same as that for MPEG-1 audio. For MPEG-2 AAC Main Profile, when prediction is enabled, break-ins are a little trickier, as they can only occur when there is a predictor reset across all frequency bands. This only happens in the case of ‘attacks’, when the bitstream switches from long to short windows, so the easiest way to break into a Main Profile bitstream is to start with a short block.</p></div></div></div><ul class="links inline"><li class="node-readmore first last"><a href="/content/stream-splicing-or-break-supported-mpeg-2-aac" rel="tag" title="Is stream splicing or &quot;break-in&quot; supported in MPEG-2 AAC?">Read more<span class="element-invisible"> about Is stream splicing or &quot;break-in&quot; supported in MPEG-2 AAC?</span></a></li></ul> Sun, 03 Feb 2013 16:50:55 +0000 leonardo 301 at https://mpeg.chiariglione.org
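Several of the AAC answers above mention Huffman coding of quantized components as one of the tools that yields additional compression. The actual AAC spectral codebooks are fixed tables defined in the standard; purely as a generic, illustrative sketch of the underlying technique (not the AAC codebooks themselves), a minimal Huffman coder for a sequence of quantized values might look like this:

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code (symbol -> bitstring) from the empirical
    frequencies of a sequence of quantized values."""
    freq = Counter(symbols)
    if len(freq) == 1:                     # degenerate one-symbol case
        return {next(iter(freq)): "0"}
    # Heap entries are (frequency, tiebreak, tree); a tree is either a
    # leaf symbol or a (left, right) pair of subtrees.  The unique
    # tiebreak keeps the heap from ever comparing trees directly.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:                   # repeatedly merge the two rarest
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        counter += 1
        heapq.heappush(heap, (f1 + f2, counter, (t1, t2)))
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):        # internal node: recurse
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                              # leaf: record its codeword
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

# Frequent values (here the quantized value 0) get the shortest codewords,
# which is where the bit-rate saving over a fixed-length code comes from.
quantized = [0, 0, 0, 1, 1, -1, 2, 0]
codes = huffman_code(quantized)
bitstream = "".join(codes[s] for s in quantized)
```

In a real codec the code tables are agreed in advance (as in the AAC standard) rather than derived per stream, so no tree has to be transmitted; the sketch above only shows why skewed distributions of quantized values compress well.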