
CHAPTER 1
Introduction to MPEG-4
Olivier Avaro

TOPICS IN THIS CHAPTER
- The MPEG Brain and Its Artifacts
- MPEG-4 in a Nutshell
- Architecture and Tools
- Next Generation of Portals
- Interactive Broadcast
- Multimedia Conferencing and Communities

This chapter provides a brief introduction to MPEG, an overview of the MPEG-4 standard, and a more detailed description of the architecture and tools of the MPEG-4 specification. The first topic casts a meme's eye on the MPEG organization and likens the MPEG body to a brain that produces standards as artifacts. The second topic provides a high-level view of the MPEG-4 standard by putting it in context and describing the fundamental principles behind the specification. The last topic goes one step deeper into the architecture of an MPEG-4 browser, thereby introducing the remaining chapters of this book.

The MPEG Brain and Its Artifacts

This section adopts a meme's point of view in describing MPEG, in the expectation that it will give the reader an interesting and accurate picture of what MPEG standards are and how they are produced. Formally, the Moving Picture Experts Group (MPEG) is a working group of ISO/IEC in charge of the development of international standards for compression, decompression, processing, and coded representation of moving pictures, audio, and their combination.

Another way of defining MPEG is to take a meme's-eye view and see MPEG as a huge brain at the service of the memes that have a digital representation: the digital memes.

MPEG and Memes

Memes were discovered by Richard Dawkins, and the term first appears in The Selfish Gene [1]. The Oxford English Dictionary, quoted by Susan Blackmore [2], now provides the following definition:

Meme: An element of a culture that may be considered to be passed on by nongenetic means, especially imitation.

Further in her book, Blackmore expands the term to refer to memetic information in any of its many forms, "including ideas, the brain structures that instantiate these ideas, the behaviors these brain structures produce, and their versions in books, recipes, maps and written music. As long as the information can be copied by a process we may call imitation, then it counts as a meme."

Keeping this definition in mind, one can make the following two remarks with regard to MPEG:

1. Some memes are carried in audiovisual data. By providing a representation of this data that improves its quality and facilitates its replication, storage, and broad diffusion, MPEG specifications serve these so-called audiovisual memes.
2. From the meme's point of view, the MPEG organization can therefore be seen as a structure at the service of the audiovisual memes.

Although these two remarks could be applied to most standardization bodies that provide representations of digital data, MPEG, by the way it works and the nature of its activity, is a particularly good example, as we will see in the next sections.

The MPEG Brain

MPEG specifications themselves carry memes, but these are not the same as the memes carried in the digital representation of the media: they are contained in the description of the processes for decoding and reconstructing the audiovisual media. A technological meme's chances of survival are given a real boost if it can enter a standard specification:

- It is replicated in all terminals compliant with the standard.
- It is assured a long lifetime and is exposed to a low risk of mutation, since it is maintained by ISO.

- It stays together with other compelling memes, which makes them all more attractive and forms with them what is called a memeplex.

One can therefore expect big fights during the standardization process: what is at stake for the memes is their replication or their death. The fighting process occurs during MPEG meetings. MPEG usually holds three meetings a year, comprising plenary meetings where decisions are made and subgroup meetings where technical discussions occur. MPEG meetings are attended by some 300 experts from over 20 countries [3]. The selection pressure on the memes is increased by the fact that some of them are patented: not only is the survival of the memes at stake, but also the interests of the companies, which exert strong financial pressure to have their intellectual property included in the standard.

MPEG, Memes, and Patents

Why does MPEG allow patented memes in the standard? This seems to go against the broad diffusion of the technology. The penalty of being patented can be compensated for by having active company supporters; patented memes therefore also have their chances, since additional efforts are made to promote them. In addition, having a memeplex with patented memes is beneficial to the other, nonpatented memes, since they benefit from the promotional efforts made by the supporters of the patented technology. Patented memes may therefore be damaging only to companies that have not been participating in the MPEG process or have no intellectual property in the standard. To mitigate this, MPEG patent pools ensure that MPEG memes are made available to everyone at reasonable cost. These costs generally do not cover much more than the R&D investments and the patent pool logistics.

The fighting process is similar to what happens in our brains, where memes (for example, music, jokes, and poetry) are fighting to be remembered and replicated. MPEG can therefore be likened to a huge brain, processing the memes contained in input documents, operating a selection on these memes, and outputting a pool of memes in the MPEG specifications. In such a competitive environment, memes use all the tricks they can to enter the specification in order to survive. The selection of memes in MPEG is done according to a set of design rules and a process named the core experiment. The set of design rules includes the following:

- A tool-based approach: The MPEG memeplex is organized into small entities called tools that can be used independently. This approach allows the use of only part of the standard while guaranteeing that all the tools can work together. It therefore allows the creation of a memeplex momentum without putting any replicating burden on the individual memes.
- Specification of the minimum: Only the tools that are needed to allow replication of the audiovisual memes (i.e., to allow interoperability of the audiovisual data) are specified. In particular, only the decoding process is specified in MPEG.
- One tool, one functionality: Options are a compromise between competing memes that permits both to enter the standard ("you let me in, I let you in"). Still, they put an extra burden on the specification and compromise interoperability. MPEG has a strict rule to avoid duplication of functionality, and competing technologies are weeded out through the decision process illustrated in Figure 1-1.

The core experiment process consists of the following steps:

1. At a given MPEG meeting, the specification is frozen and described in a document called a verification model. Mutations of this specification are defined and are called core experiments. A core experiment defines the evaluation process of a given tool: test data, a description of the technology, and evaluation criteria.
2. Between meetings, the core experiment is carried out. In general, results are exchanged on mailing lists dedicated to that purpose. All the various mutations of the verification model are carried out in parallel and reported back to MPEG.
3. At the next MPEG meeting, the results of the experiments are gathered and decisions are made on the way the specification should evolve.

FIGURE 1-1 The selection of memes in MPEG is made according to a set of design rules and a process named the core experiment: mutations of verification model version N are evaluated in parallel and lead to verification model version N + 1.

Finally, an important aspect of the MPEG process is that the MPEG memeplex is not documented only in plain English, that is, in a language only understandable by some human brains, with all its richness and approximation.

MPEG also requires that the specification be written in a language understandable by virtually any programmable device (e.g., in C, C++, or Java). This requirement not only validates the specification one step further, it also considerably improves the reproducibility and fidelity of the memes contained in the standard.

MPEG Standards

The MPEG brain provides a selective environment for technological memes. After a strict evaluation, only a few of them are documented in textual and software standards, the artifacts of the MPEG activity. Even though some technological memes, such as those describing a decoding process, get promoted through the MPEG process and therefore benefit from it, it is really the audiovisual memes that benefit most from the MPEG structure, as we will see from the existing MPEG standards and the ones under development.

So far, MPEG has produced MPEG-1, the standard for storage and retrieval of moving pictures and audio on storage media (approved November 1992); MPEG-2, the standard for digital television (approved November 1994); and MPEG-4, the standard for multimedia applications (approved October 1998). MPEG has also developed MPEG-7, the content representation standard for multimedia information search, filtering, management, and processing (approved September 2001), and is now developing MPEG-21, the multimedia framework.

MPEG-1 and MPEG-2 have provided the foundation for the digital representation of the audiovisual memes. Thanks to these standards, and in conjunction with the proliferation of networks to transport the encoded information as well as of terminals to display the content, audiovisual memes are ubiquitous. The main advantage of MPEG-2 over MPEG-1, from a meme's-eye view, is the possibility of having access to more human brains through TV broadcasting networks and terminals as well as through multilingual capabilities.

MPEG-4 extends the MPEG-1 and MPEG-2 models in three directions:

- Object-based representation: MPEG-4 models audiovisual memes with a rich set of representations called audiovisual objects. These include high-level mathematical models such as curves and surfaces, not just compressed audiovisual samples. From a meme's-eye view, this means a richer palette of digital representations.
- Interactive audiovisual scenes: Audiovisual objects can be grouped together into more complex structures called audiovisual scenes, to which a programmatic behavior can be attached. Reproducibility, as well as reusability, of the memes is improved since the data have an inherent structure. As the audiovisual content becomes more attractive, the impact of the memes on human brains grows.

- Network abstraction: With DMIF (Delivery Multimedia Integration Framework), MPEG-4 provides an abstraction of the network so that MPEG-4 data can be authored once and carried on any transport mechanism with minimal adaptation effort. With DMIF and the object-based scalability provided by MPEG-4, audiovisual memes can reach virtually any connected brain.

MPEG-7 is another breakthrough for audiovisual memes: MPEG-7's XML-based representation of the audiovisual data can reach the brains of computers. MPEG-7 descriptions allow both low-level (e.g., color histograms) and high-level (e.g., data structure) reasoning on the audiovisual content. With these tools, audiovisual memes can be more easily searched, filtered, and retrieved by automated agents. Finally, with MPEG-21, the multimedia framework, MPEG is developing a complete ecosystem for richer and more structured digital memes, not restricted only to audiovisual ones. MPEG-21 will allow these digital memes to circulate as freely as possible, according to the usage rights that human agents have attached to them.

Summarizing the meme's point of view of MPEG, we have seen that MPEG standards are evolving from MPEG-1 to MPEG-21 in such a way that the efficiency of the representation and replication of memes, in human brains as well as in computer brains, is constantly improved. We have also seen that the MPEG standard development process optimizes the memeplex formed by the standardized technology so that it best serves audiovisual memes. The MPEG standardization body can therefore be seen as a structure working for the replication and diffusion of memes in general and audiovisual memes in particular. One may claim that this is not its sole purpose; indeed, a rich economy is created around MPEG standards and could be seen as the primary goal of the standardization. Still, tastes and technologies change over time, sometimes quite rapidly. What will remain in any case are the digital representations of the memes, as if this were the only raison d'être of MPEG, in the same way that what remains from ancient cultures are the artifacts that have endured through time.

MPEG-4 in a Nutshell

This section provides a high-level overview of the MPEG-4 standard. Here, we discuss the design goals and principles behind MPEG-4, navigation in the MPEG-4 specification, and MPEG-4 as it relates to other multimedia standards.

MPEG-4 Design Goals and Principles

The MPEG-4 project began in 1993 with the initial goal of very low bit rate audiovisual coding. The initial participants were consumer electronics companies, the computer industry, and telecom operators, as well as academia.

Early in the effort, it became clear that this goal would only be reached if major changes in the MPEG paradigm were adopted, since no major breakthrough was expected in the compression area. This change of paradigm was also supported by a major evolution in the way audiovisual content would be produced, delivered, and consumed in the coming years, as summarized in Table 1-1. In 1994, the goal of the MPEG-4 standard was therefore changed to the "coding of audiovisual objects," to more accurately reflect what was now the focus of the work. In 2001, one can say that the MPEG-4 standard is finalized, even though some additional tools are still under development for advanced functionality. These ongoing extensions are described in More MPEG-4 Jump-Start.

The main design goals of MPEG-4 can be summarized as follows:

- To provide a corpus of technology to be used by various types of multimedia services and networks, including interactive, broadcast, and conversational models. It was requested that the audiovisual content should flow seamlessly among these different types of services.
- To improve the user experience and provide audiovisual content with the same kind of interactivity that can be found on the World Wide Web. This implies client-side as well as client-server interactivity.
- To integrate rich media content in a single framework so that it can be seamlessly manipulated by content authors as well as by end users. Such rich media include both natural and synthetic content.

TABLE 1-1 Evolution of the Production, Delivery, and Consumption Paradigm

Production
  Traditional way: Mostly 2D content produced with cameras and microphones.
  MPEG-4 way: 2D/3D and computer-generated content, presegmented material, hybrid natural/synthetic AV coding.

Delivery
  Traditional way: Few networks carry AV information (satellite, LAN, ISDN); homogeneous network communications.
  MPEG-4 way: New networks carry AV information (PSTN, wireless networks, Internet); communication across multiple network types (heterogeneous networks).

Consumption
  Traditional way: Passive consumption; few media types on low-bandwidth networks (mostly text).
  MPEG-4 way: Information is read, seen, and heard in an interactive way (interaction with objects in the content, not only frames); more and more of the information is audiovisual.

IPMP: MPEG-4 Intellectual Property Management and Protection

How does MPEG-4 tackle the problem of pirated audiovisual content? An important design goal of the MPEG-4 standard is to allow consumption of the audiovisual content while respecting the usage rights that are attached to it. MPEG-4 has currently standardized a framework, called the MPEG-4 hooks, that protects audiovisual content. The hooks allow the identification of the system used to protect the content, the so-called IPMP system. The IPMP system itself is not specified by MPEG. MPEG-4 is now standardizing an extension of these MPEG-4 hooks, which will provide more interoperability for protected content as well as more flexibility for the protecting system. This extension will take into account the requirements of end users, who do not want to need thousands of devices to consume their content. It will also take into account the requirement of renewability, allowing IPMP systems to become more robust and withstand attacks from pirates. Finally, it is interesting to note that existing content can be given new life by the MPEG-4 rich media framework, in the same way that existing movies are enhanced with interactive features and games in the DVD industry.

Following these design goals, the MPEG-4 standard has been developed on a basic and simple principle: the concept of audiovisual scenes made of audiovisual objects composed together according to a scene description. This concept of audiovisual scene allows

- interaction with elements within the audiovisual content, named audiovisual objects;
- adaptation of the coding scheme on a per-audiovisual-object basis;
- easy reuse and customization of audiovisual content.

Audiovisual objects can be of different natures. They can be purely audio objects, such as single- or multichannel audio content, or purely visual, such as a traditional rectangular movie or a more exotic, arbitrarily shaped video object. Objects can be natural, like audiovisual data captured from a microphone or a camera, or synthetic, like text and graphics overlays, animated faces, or synthetic music. They can be 2D, like a Web page, or 3D, like a spatialized sound or a 3D virtual world. The scene description provides the spatial and temporal relationships between the audiovisual objects. These relationships can be purely 2D or 3D, but they can also mix 2D and 3D scene description. The behavior and interactivity of the audiovisual objects and scenes are also specified in the scene description.
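To make this composition model concrete, the following minimal Java sketch models a scene as a list of audiovisual objects, each with a spatial position and a start time, and "composes" the scene at a given instant. It is only an illustration of the concept; the class and field names (Scene2D, AVObject, and so on) are invented for this example and do not correspond to the normative BIFS node types.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative only: a toy model of "objects composed according to a scene
// description". Names are invented for this sketch and are not MPEG-4 node types.
public class Scene2D {

    // One audiovisual object: an identifier, a 2D position, and a start time.
    static class AVObject {
        final String name;      // e.g., "backgroundStill", "presenterVideo"
        final double x, y;      // spatial placement in scene coordinates
        final double startTime; // temporal placement, in seconds

        AVObject(String name, double x, double y, double startTime) {
            this.name = name;
            this.x = x;
            this.y = y;
            this.startTime = startTime;
        }
    }

    private final List<AVObject> objects = new ArrayList<>();

    void add(AVObject o) {
        objects.add(o);
    }

    // "Compose" the scene at a given time: list the objects that are active.
    void compose(double time) {
        System.out.printf("Scene at t=%.1fs:%n", time);
        for (AVObject o : objects) {
            if (o.startTime <= time) {
                System.out.printf("  %-16s at (%.0f, %.0f)%n", o.name, o.x, o.y);
            }
        }
    }

    public static void main(String[] args) {
        Scene2D scene = new Scene2D();
        scene.add(new AVObject("backgroundStill", 0, 0, 0.0));
        scene.add(new AVObject("presenterVideo", 120, 80, 0.0));
        scene.add(new AVObject("logoOverlay", 10, 10, 5.0)); // appears after 5 s
        scene.compose(2.0);
        scene.compose(6.0);
    }
}
```

The point of the sketch is simply that placement in space and time is carried by the scene description, not baked into the coded media samples themselves.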

In addition, MPEG-4 defines specific protocols to modify and animate the scene description over time, thereby providing incremental build-up, modification, and animation of the audiovisual content. It is also important to note that all this information is provided as compressed binary streams that can be synchronized. A typical audiovisual scene is shown in Figure 1-2.

FIGURE 1-2 Various video streams and audio signals are composed on top of a fixed-background still picture according to a scene description. An additional concept is that of the object descriptor: these tiny structures provide the links between the scene description and the streams of the audiovisual objects.

Navigation in MPEG-4

The MPEG-4 standard specification is a fairly complex set of documents. This book provides content authors with most of the information they need to develop MPEG-4 content, as far as the scene description is concerned.

It also provides the architectural elements needed to understand how media streams and other concepts like IPMP fit into the big picture. Still, readers of this book may need occasional direct access to the MPEG-4 specification. This section gives some quick navigation advice with this in mind.

MPEG-4 is a standard in several parts, grouped under a single specification number assigned by ISO (ISO/IEC 14496). The main parts are illustrated in Figure 1-3 and described below:

- Systems [4] and DMIF [9]: These parts of the standard encompass the interactive scene description as well as the specification of the tools needed for the synchronization of the audiovisual content and its carriage on various networks.
- Audio [5]: This part of the standard contains all the representations of audio objects, either natural or synthetic.
- Visual [6]: This part of the standard contains all the representations of visual objects, either natural or synthetic.

In addition, MPEG-4 contains

- Conformance [7]: This specification describes how compliance with the various parts of the standard can be tested. It contains, in particular, audio, visual, and systems test streams.
- Reference Software [8]: This document contains a complete software implementation of the MPEG-4 specification, available for use in any commercial application compliant with the standard.

FIGURE 1-3 The main parts of the MPEG-4 standard: the coded representation of natural and synthetic audio information, the coded representation of natural and synthetic visual information, and the carriage and synchronization of audiovisual information, combined into an audiovisual scene.

Versions, Amendments, Corrigenda, and Extensions

The technologies considered for standardization in MPEG-4 were not all at the same level of maturity. Therefore, the development of the standard was organized in several phases, called versions. New versions complete the current standardized toolbox with new tools and new functionality; they do not replace the tools of the previous versions. There are currently five versions of MPEG-4 Systems. In ISO language, versions are called amendments.

Sometimes, errors are found in the specification. These errors are gathered in documents named corrigenda that are published as needed. Currently, one corrigendum has been finalized for MPEG-4 Systems, and a new one is under development. Periodically, ISO publishes a new edition of the standard that gathers all amendments and corrigenda issued since the last edition. The current edition of MPEG-4 Systems is ISO/IEC 14496-1:2001; it contains MPEG-4 Systems Version 1, Version 2, and Corrigendum 1. Finally, because the numbering of amendments restarts each time a new edition is published, the link between version numbering and amendment numbering was difficult to maintain. MPEG has therefore defined a new name, extensions, for the successive additions made to the standard. The extension numbering does not wrap around and is therefore easier to follow.

The focus of this book is the scene description and the representation of the synthetic audio and visual objects, as seen from an authoring perspective. This information is spread throughout the first three parts of the standard and is, therefore, a bit difficult to find. Generally, all the information related to the structure of the scene description and its animation, as well as to graphical objects that do not have streams attached to them, can be found in the Systems part of the standard. Specific synthetic audio and visual objects that do have a streamed representation can be found in the Audio and Visual parts of the standard.

MPEG-4 and Other Multimedia Standards

Prior to the development of the MPEG-4 standard, other multimedia standards and solutions were in place. MPEG-4 has extensively used and referred to these ancestor technologies. The main ones are

- MPEG-2 [11] and H.323 [14]: MPEG-4 used the media representations developed by these standards to construct the MPEG-4 data formats. At the systems level, MPEG-4 is backward compatible with MPEG-1 and MPEG-2 audio and visual streams. MPEG-4 Systems uses MPEG-2 transport mechanisms to carry MPEG-4 data. A simple version of MPEG-4 Video is backward compatible with H.263.
- VRML97 [10]: MPEG-4 based its scene description on VRML97 and provided additional functionality: the integration of streams, 2D capabilities, integration of 2D and 3D, advanced audio features, a timing model, update and animation protocols to modify the scene over time, and compression efficiency with BIFS (Binary Format for Scenes). Background on this key specification is provided in Chapter 2, "Virtual Reality Modeling Language (VRML) Overview."
- QuickTime [20]: Several file formats for storing, streaming, and authoring multimedia content were available. Among those most used at present are Microsoft ASF, Apple QuickTime, and the RealNetworks file format (RNFF). The ASF and QuickTime formats were proposed to MPEG-4 in response to a call for proposals on file format technology; QuickTime was selected as the basis for the MPEG-4 file format (referred to as MP4).

During the development of the MPEG-4 standard, other communities developed tools that have been used by the MPEG-4 standard. Among these tools, we can find

- Java [16][17] technology: MPEG-4 offers a programmatic environment, MPEG-J, that seeks to extend the content creator's ability to incorporate complex controls and data processing mechanisms along with the BIFS scene representations and elementary media data. The MPEG-J environment is intended to enhance the end user's ability to interact with the content.
- XML [12]: MPEG-4 offers an XML-based representation of the scene description, called XMT (eXtensible MPEG-4 Textual format). XMT comes in two flavors: a low-level representation that exactly mirrors the BIFS representation and a high-level representation that is closer to the author's intent and that can be mapped onto the low-level format. This format is fully compatible with the one currently under development by the Web3D Consortium, X3D [18].

In parallel with the development of the MPEG-4 specifications, other standardization bodies and industry consortia have developed tools and applications that address some of the MPEG-4 objectives. Concerning proprietary formats, technical issues aside, the mere fact of being closed is a significant disadvantage in the content industry when open alternatives exist.

With the separation of the content production, delivery, and consumption stages in the multimedia pipeline, the MPEG-4 standard will enable different companies to separately develop authoring tools, servers, or players, thus opening up the market to independent product offerings. This competition will probably allow rapid proliferation of content and tools that interoperate.

Several technologies can be seen as competitors of MPEG-4. The more relevant ones are

- SMIL [13]: The Synchronized Multimedia Integration Language is an XML-compliant specification for 2D multimedia scene descriptions developed by the W3C SMIL working group.
- SVG [19]: The Scalable Vector Graphics format is an XML-compliant specification also developed by the W3C.

As depicted in Figure 1-4, the XMT format facilitates interoperability with the X3D, SMIL, and SVG specifications. One of the design goals of XMT has been to maximize the overlap with SMIL, SVG, and X3D. Therefore, content authors are now able to compile into MPEG-4 the content they have already produced in these formats, given explicit authoring constraints. Not all SMIL and SVG content is supported by XMT, since some functionality was already defined in MPEG-4; replicating those tools would have put extra complexity on the MPEG-4 standard. In addition, MPEG-4 supports features that are not supported by these formats. For example, MPEG-4 is built on a true 2D and 3D scene description, including the event model, as extended from VRML. None of the currently available MPEG-4 competitors reaches the sophistication of MPEG-4 in terms of composition capabilities and interactivity features.

FIGURE 1-4 One of the design goals of XMT has been to maximize the overlap with SMIL, SVG, and X3D: XMT content can be parsed and played by SMIL and X3D players or compiled to MPEG-4 and played in an MPEG-4 browser.

Furthermore, incorporation of the temporal component in streamed scene descriptions is not a trivial matter. MPEG-4 has successfully addressed this issue, as well as the overall timing and synchronization issues, whereas alternative approaches are lacking in this respect. Finally, MPEG-4 is the fruit of a multiyear collaborative effort on an international scale aimed at the definition of an integrated framework. The fragmented nature of the development of competing specifications by different bodies and industries certainly hinders integrated solutions. This may also cause a distorted vision of the integrated, targeted system as well as duplication of functionality. There is no evidence that real integration can be achieved by any alternative framework.

Architecture and Tools

This section first discusses the MPEG-4 end-to-end architecture and then delves deeper into the anatomy of an MPEG-4 browser.

End-to-End Architecture

The MPEG-4 standard is designed to be used in many environments and terminals. Still, the way MPEG-4 content is produced, delivered, and consumed always follows the same walk-through, as depicted in Figure 1-5. This architecture highlights the main tools of the MPEG-4 standard as well as their position in the end-to-end design.

At the beginning of the walk-through are the content authors. They produce audiovisual content with the tools they have available; part of the content creation process may be live or automated. The content creation process can be separated into two steps: authoring and publishing. Authoring is related to the production of the audiovisual data, including the scene description and interaction. Publishing is related to the adaptation of the content to the constraints imposed by, for example, the networks on which it will be carried or the terminals on which it will be consumed.

The content is delivered to MPEG-4 servers in XMT format, as described in Chapter 11, "Extensible MPEG-4 Textual Format (XMT)," or in the MP4 file format, described in More MPEG-4 Jump-Start. The choice between these two formats depends on the freedom the authors want at the next stage of the delivery chain. XMT provides a lot of flexibility in adapting the content to further constraints; in addition, it may contain extra information that makes it an appropriate format for exchange between content authors. MP4 is more rigid in that sense but also more deterministic with regard to what users will see. This may be what authors prefer if they want to ensure that their content is not further manipulated.

FIGURE 1-5 This end-to-end architecture highlights the main tools of the MPEG-4 standard as well as their position in the end-to-end design: content creation and publishing delivers XMT or MP4 content to MPEG-4 servers, which reach interactive television sets, mobile terminals, and computers over Internet and mobile networks or MPEG-2 systems.

MPEG-4 servers use MP4 files to serve the content on various networks. Although MP4 files are the natural interoperability point between the content authors and the MPEG-4 servers, this does not mean that the content will be stored as MP4 files on the server side. Indeed, this is left to the server implementation, which may have other ways to represent the content that are more optimized for its own software and hardware.

What goes out of the server are streams of data containing MPEG-4 content, called elementary streams. The content of these elementary streams is discussed later in this chapter.

What is important at this stage is that MPEG-4 audiovisual scenes can be split into several elementary streams, that these streams can be carried on possibly different networks, and that end terminals receiving these streams from different networks are able to reconstruct the transmitted data in a synchronized manner.

The spectrum of MPEG-4 end devices goes from standard computers to mobile devices, through interactive television sets. This last device is a good example with which to illustrate the various ways MPEG-4 content can be consumed. The interactive TV set first receives, through its satellite connection, an MPEG-2 transport stream containing the main MPEG-2 digital TV program. When MPEG-4 is carried on MPEG-2 transport streams, some MPEG-4 content related to this TV program arrives at the terminal and provides the user with an enhanced interactive experience. This experience is based on broadcast content, such as local interaction with a 3D model of a car in an advertisement or navigation through a multimedia electronic program guide. Let's assume the TV set is also connected to the Internet with an ADSL link. The broadcast experience of the user is now augmented with client-server functionality as well as with richer media mixed with the TV program. One can imagine a range of services going from program enhancements with video clips streamed from the network on demand up to multiuser games related to the TV programs, with votes, 3D chats, and interaction with the scenario of the broadcast content.

DMIF and Carriage of MPEG-4 Content

One of the design goals of MPEG-4 is to cover a wide range of access conditions so that content can be created once and played on any network. This goal is achieved by abstracting the content delivery layer with an interface named the DAI (DMIF Application Interface), as defined in [9]. At the MPEG-4 level, the interoperability points are therefore the format of the elementary streams and compliance with the walk-through defined by the DAI. What happens below the DAI is, in principle, outside the scope of the MPEG-4 standard. Still, in some cases, because MPEG-4 needed specific tools for its transport, such as an efficient, low-complexity multiplexing tool (FlexMux) and a dedicated file format (MP4), these specifications have been developed within the MPEG-4 standard. So that MPEG-4 content can actually be transported in existing environments, network-specific transport mechanisms have been defined. Currently, the following transport mechanisms are available:

- Carriage of MPEG-4 content on the Internet [21].
- Carriage of MPEG-4 content in MPEG-2 transport streams [11].
- Storage of MPEG-4 content in MP4 files [4].
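As a rough illustration of what a low-overhead multiplexer in the spirit of FlexMux does, the sketch below interleaves packets from several logical streams into a single byte stream, prefixing each payload with a stream index and a length. This is a deliberately simplified assumption made for the example; it does not reproduce the actual FlexMux syntax, the DAI, or the SL packet headers defined in the standard.

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

// Illustrative only: a toy interleaver in the spirit of a low-overhead
// multiplexing tool. The 2-byte (index, length) header is an assumption made
// for this sketch, not the normative FlexMux syntax.
public class ToyMux {

    private final ByteArrayOutputStream out = new ByteArrayOutputStream();

    // Append one packet of a logical stream to the multiplexed byte stream.
    void writePacket(int streamIndex, byte[] payload) {
        if (streamIndex < 0 || streamIndex > 255 || payload.length > 255) {
            throw new IllegalArgumentException("toy header limits exceeded");
        }
        out.write(streamIndex);      // which elementary stream this packet belongs to
        out.write(payload.length);   // payload length in bytes
        out.write(payload, 0, payload.length);
    }

    byte[] toBytes() {
        return out.toByteArray();
    }

    public static void main(String[] args) {
        ToyMux mux = new ToyMux();
        // Interleave packets of a "scene description" stream (index 1),
        // a "video" stream (index 2), and an "audio" stream (index 3).
        mux.writePacket(1, "scene update".getBytes(StandardCharsets.UTF_8));
        mux.writePacket(2, new byte[] {0x00, 0x01, 0x02, 0x03});
        mux.writePacket(3, new byte[] {0x10, 0x11});
        mux.writePacket(2, new byte[] {0x04, 0x05, 0x06});
        System.out.println("Multiplexed size: " + mux.toBytes().length + " bytes");
    }
}
```

The design motivation is the same as described in the sidebar: when a session carries many small streams, per-packet overhead must stay in the range of a couple of bytes.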

MPEG-4 Browser Architecture and Tools

It's now time to dig deeper into the anatomy of an MPEG-4 browser. The architecture of the browser is fully specified in [4], and most of the high-level description below is taken from [22], where it is further documented. The overall architecture of an MPEG-4 terminal is depicted in Figure 1-6.

FIGURE 1-6 MPEG-4 browser architecture. Multiplexed streams from the transmission or storage medium (MPEG-2 TS, RTP/UDP/IP, AAL2/ATM, MP4 files, DAB multiplex) traverse the delivery layer and the DMIF Application Interface; the sync layer turns SL-packetized streams into elementary streams for the compression layer (object description, scene description, and AV object data), whose outputs are composed, rendered, and displayed as an interactive audiovisual scene, with user interaction optionally conveyed back as upstream information.

Starting at the bottom of the figure, we first encounter the particular storage or transmission medium. This refers to the lower layers of the delivery infrastructure (network layer and below, as well as storage). The transport of the MPEG-4 data can occur on a variety of delivery systems, as we have already seen. These include MPEG-2 transport streams, RTP/UDP over IP, AAL2 on ATM, an MPEG-4 (MP4) file, or a DAB multiplexer. Most of the currently available transport-layer systems provide a native means for multiplexing information. There are, however, a few instances where this is not the case (e.g., GSM data channels). In addition, the existing multiplexing mechanisms may not fit MPEG-4's needs in terms of low delay, or they may incur substantial overhead in handling the expected large number of streams associated with an MPEG-4 session. As a result, MPEG-4 has defined a multiplexing tool, FlexMux, that can optionally be used on top of the existing transport delivery layer.

The delivery layer provides the MPEG-4 terminal with a number of elementary streams. These streams can contain a variety of information: audiovisual object data, scene description information, control information in the form of object descriptors, as well as meta-information that describes the content or associates intellectual property rights with it. Note that not all of the streams have to be downstream (server to client); in other words, it is possible to define elementary streams for the purpose of conveying data back from the terminal to the transmitter or server, named upstream channels. MPEG-4 standardizes both the mechanisms by which the transmission of such data is triggered at the terminal and its format as it is transmitted back to the sender.

Regardless of the type of data conveyed in each elementary stream, it is important that streams provide a common mechanism for conveying timing and framing information. The Sync Layer (SL) is defined for this purpose. It is a flexible and configurable packetization facility that allows the inclusion of timing, fragmentation, and continuity information on associated data packets. Such information is attached to data units that comprise complete presentation units, for example, an entire video frame or an audio frame. These data units are called access units. Elementary streams are sent to their respective decoders, which process the data and produce composition units (e.g., a decoded video frame).

Control information in the form of object descriptors is used to let the receiver know what type of information is contained in each stream. These descriptors associate sets of elementary streams with one audio or visual object, define a scene description stream, or even point to an object descriptor stream. These descriptors, in other words, are the way in which a terminal can identify the content being delivered to it. Unless a stream is described in at least one object descriptor, it is impossible for the terminal to make use of it.
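The following sketch illustrates, with invented Java types, the flow just described: access units tagged with decoding and composition times are handed to a decoder, which produces composition units, and a table of object descriptors tells the terminal which kind of decoder each stream needs. The class names and the simplified two-timestamp model follow the general description above; they are not the exact SL packet or object descriptor syntax of the specification.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: invented types showing how access units, decoders, and
// object descriptors relate. Not the normative SL or OD syntax.
public class ToyTerminal {

    // One complete presentation unit of an elementary stream (e.g., a coded frame).
    static class AccessUnit {
        final int streamId;
        final double decodingTime;     // when the decoder should consume it
        final double compositionTime;  // when the decoded result should be composed
        final byte[] payload;

        AccessUnit(int streamId, double dts, double cts, byte[] payload) {
            this.streamId = streamId;
            this.decodingTime = dts;
            this.compositionTime = cts;
            this.payload = payload;
        }
    }

    // A decoded unit ready for the compositor.
    static class CompositionUnit {
        final int streamId;
        final double compositionTime;
        CompositionUnit(int streamId, double cts) {
            this.streamId = streamId;
            this.compositionTime = cts;
        }
    }

    // Minimal "object descriptor": which kind of decoder a stream needs.
    static class ObjectDescriptor {
        final String decoderType; // e.g., "video", "audio", "sceneDescription"
        ObjectDescriptor(String decoderType) { this.decoderType = decoderType; }
    }

    private final Map<Integer, ObjectDescriptor> descriptors = new HashMap<>();

    void register(int streamId, ObjectDescriptor od) {
        descriptors.put(streamId, od);
    }

    // A stream without any object descriptor cannot be used by the terminal.
    CompositionUnit decode(AccessUnit au) {
        ObjectDescriptor od = descriptors.get(au.streamId);
        if (od == null) {
            throw new IllegalStateException("no object descriptor for stream " + au.streamId);
        }
        System.out.printf("Decoding %s AU of stream %d at DTS=%.2f%n",
                od.decoderType, au.streamId, au.decodingTime);
        return new CompositionUnit(au.streamId, au.compositionTime);
    }

    public static void main(String[] args) {
        ToyTerminal t = new ToyTerminal();
        t.register(2, new ObjectDescriptor("video"));
        CompositionUnit cu = t.decode(new AccessUnit(2, 0.00, 0.04, new byte[] {1, 2, 3}));
        System.out.printf("Composition unit ready for t=%.2f s%n", cu.compositionTime);
    }
}
```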

Detailed descriptions of how synchronization is handled in MPEG-4, as well as the detailed mechanisms of the object descriptor framework, are given in More MPEG-4 Jump-Start. Advanced synchronization mechanisms, also described in More MPEG-4 Jump-Start, augment this timing model to permit synchronization of multiple streams and objects that may originate from multiple sources. FlexTime allows the definition of simple temporal relationships among MPEG-4 objects, such as CoStart, CoEnd, and Meet, as well as the specification of constraints on the timing relationships between MPEG-4 objects, as if the objects were on stretchable springs.

At least one of the streams must be the scene description information associated with the content. The scene description information defines the spatial and temporal position of the various objects, their dynamic behavior, and any interactivity features made available to the user. As mentioned above, the audiovisual object data is actually carried in its own elementary streams. The scene description contains pointers to object descriptors when it refers to a particular audiovisual object. We should stress that an object (in particular, a synthetic object) may be fully described by the scene description. As a result, it may not be possible to uniquely associate an audiovisual object with just one syntactic component of MPEG-4 Systems.

As detailed in Chapter 3, "2D/3D Scene Composition," the scene description is tree structured and is heavily based on the VRML structure. MPEG-4 provides a binary representation for the scene description. The compression efficiency of this binary representation depends heavily on the quality of the quantization; Chapter 5, "Quantization in BIFS-Updates," is dedicated to this issue. Major extensions of the VRML scene description have been developed in MPEG-4 for audio composition (see More MPEG-4 Jump-Start).

A key feature of the scene description is that, since it is carried in its own elementary stream(s), it can contain full timing information. This implies that the scene can be dynamically updated over time, a feature that provides considerable power to content creators. In fact, the scene description tools provided by MPEG-4 also provide a special lightweight mechanism to modify parts of the scene description in order to effect animation (BIFS-Anim). Animation is accomplished by coding, in a separate stream, only the parameters that need to be updated. This mechanism is fully described in Chapter 4, "BIFS-Updates," and Chapter 6, "Animating Scenes in MPEG-4."

The system's compositor uses the scene description information to aggregate the various natural and synthetic audiovisual object data and to render the final scene for presentation to the user.
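Before turning to the individual object types, the update idea described above (sending, in a separate timed stream, only the parameters that change) can be illustrated with a small sketch. The command format and class names below are invented for this example and only mimic the spirit of BIFS-Updates and BIFS-Anim, not their actual binary syntax.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: a toy "scene update" mechanism in the spirit of
// BIFS-Updates. Command and node representations are invented for this sketch.
public class ToySceneUpdates {

    // A named node with a few mutable fields (here, just a 2D position).
    static class Node {
        double x, y;
        Node(double x, double y) { this.x = x; this.y = y; }
    }

    // One timed update: at 'time', set field 'field' of node 'nodeName' to 'value'.
    static class Update {
        final double time;
        final String nodeName;
        final String field;   // "x" or "y" in this toy model
        final double value;
        Update(double time, String nodeName, String field, double value) {
            this.time = time; this.nodeName = nodeName; this.field = field; this.value = value;
        }
    }

    private final Map<String, Node> nodes = new HashMap<>();

    void apply(Update u) {
        Node n = nodes.get(u.nodeName);
        if (n == null) return; // unknown node: ignored in this toy model
        if ("x".equals(u.field)) n.x = u.value;
        if ("y".equals(u.field)) n.y = u.value;
        System.out.printf("t=%.1fs: %s moved to (%.0f, %.0f)%n", u.time, u.nodeName, n.x, n.y);
    }

    public static void main(String[] args) {
        ToySceneUpdates scene = new ToySceneUpdates();
        scene.nodes.put("logo", new Node(10, 10));
        // A small "animation stream": only the changing parameters are sent.
        Update[] stream = {
            new Update(1.0, "logo", "x", 20),
            new Update(2.0, "logo", "x", 30),
            new Update(2.0, "logo", "y", 15),
        };
        for (Update u : stream) scene.apply(u);
    }
}
```

The point is that the full scene is transmitted once, and the animation stream carries only deltas, which is what keeps BIFS-Anim lightweight.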

Synthetic visual objects can be as diverse as 2D meshes (see Chapter 7, "2D Mesh Animation"), 3D meshes (see Chapter 10, "3D Mesh Coding"), facial animation (see Chapter 8, "MPEG-4 Face and Body Animation Tools and Applications"), and body animation (see Chapter 9, "MPEG-4 Human Virtual Body Animation"). In More MPEG-4 Jump-Start, the palette of available MPEG-4 audio tools is described. It includes, in particular, speech coding from 2 to 4 kbit/s, scalable generic audio coding from 4 to 64 kbit/s, text-to-speech synthesis, and synthetic music coding. An overview of the MPEG-4 video tools presents the MPEG-4 coding algorithms for shaped, scalable, and error-resilient video and provides comparison data with other coding schemes. Still-picture coding, which is of particular importance for the representation of textures in computer graphics artwork, is also covered.

The scene description tools provide mechanisms to capture user or system events. In particular, the tools allow the association of events with user operations on desired objects, which can in turn modify the behavior of the stream. Event processing is the core mechanism with which application functionality and differentiation can be provided.

Profiles and Levels

Most applications only need part of the MPEG-4 tool set. MPEG-4 profiles define subsets useful for a large class of applications and services. MPEG-4 defines several types of profiles that can be combined to specify a complete audiovisual terminal. The MPEG-4 types of profiles are as follows:

- Scene description: These profiles define the features of the scene description and behavior that are supported.
- Object descriptor: These profiles define the constraints on the timing model as well as on the IPMP tools.
- Audio (natural and synthetic): These profiles define the types of audio objects that are supported.
- Visual: These profiles define the types of visual objects that are supported.
- Graphics: These profiles define the types of synthetic visual objects that are supported.

Levels limit the number of objects and the complexity for a given profile. Compliance with the MPEG-4 standard is claimed for a profile at a certain level. Although many profiles and levels are already standardized to cover most applications, new ones can easily be added when the need arises.
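As a simple illustration of how a terminal might use a profile-and-level indication, the sketch below checks whether the requirements declared by a piece of content fit the terminal's declared capabilities. The profile names and numeric limits are placeholders chosen for this example, not values taken from the standard.

```java
// Illustrative only: a toy capability check inspired by the profile/level idea.
// Profile names and limits are placeholders, not normative values.
public class ToyProfileCheck {

    // What a terminal supports for one profile type (e.g., "Graphics").
    static class Capability {
        final String profile;
        final int level;       // higher level = more objects/complexity allowed
        final int maxObjects;  // placeholder limit associated with that level
        Capability(String profile, int level, int maxObjects) {
            this.profile = profile; this.level = level; this.maxObjects = maxObjects;
        }
    }

    // What a piece of content declares it needs.
    static class Requirement {
        final String profile;
        final int level;
        final int objectCount;
        Requirement(String profile, int level, int objectCount) {
            this.profile = profile; this.level = level; this.objectCount = objectCount;
        }
    }

    static boolean canPlay(Capability cap, Requirement req) {
        return cap.profile.equals(req.profile)
                && cap.level >= req.level
                && cap.maxObjects >= req.objectCount;
    }

    public static void main(String[] args) {
        Capability terminal = new Capability("Graphics", 2, 32);
        Requirement content = new Requirement("Graphics", 1, 10);
        System.out.println("Content playable: " + canPlay(terminal, content));
    }
}
```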

To provide flexibility in event processing, MPEG-4 allows the use of ECMAScript (also known as JavaScript) within the scene description. Use of scripting tools is essential in order to access state information and implement sophisticated interactive applications. MPEG-4 also defines a set of Java programming language APIs (MPEG-J) through which access to an underlying MPEG-4 engine can be provided to Java applets (called MPEG-lets). This complement of tools can form the basis for very sophisticated applications, opening up completely new ways for audiovisual content creators to augment the use of their content. A complete description of the MPEG-4 application execution engine is provided in More MPEG-4 Jump-Start.

It is important to point out that, in addition to the new functionalities that MPEG-4 makes available to content consumers, it provides tremendous advantages to content creators as well. The use of an object-based structure, where composition is performed at the receiver, considerably simplifies the content creation process. Starting from a set of coded audiovisual objects, it is very easy to define a scene description that combines these objects into a meaningful presentation. A similar approach is used in HTML and Web browsers, thus allowing even inexpert users to easily create their own content. The fact that the content's structure survives the process of coding and distribution also allows for its reuse. For example, content filtering or searching applications can easily be implemented using the ancillary information carried in object descriptors. Also, users themselves can easily extract individual objects, assuming that the intellectual property information allows them to do so.

Applications

This section illustrates the use of MPEG-4 in some application scenarios. The snapshots were produced by real applications or prototypes. These examples are merely intended to illustrate possible uses of MPEG-4 Systems technologies in the multimedia industries.

Next Generation of Portals

In this application scenario, the client terminal is a multimedia terminal connected to the Internet. An example of such a terminal is a personal computer (PC) with multimedia features. The MPEG-4-based application can be received over the network from a remote server, or it can be locally resident on a CD-ROM or a DVD-ROM. The MPEG-4 browser can be configured as a plug-in for a standard Web browser.

After having collected information about a product of interest, the user may decide to receive more information about it through direct communication with a vendor. He may then enter the vendor's 3D virtual shop as depicted in Figure 1-7, navigate in it, further examine products modeled in 3D, and finally, by triggering a button, start a real-time communication with the vendor. MPEG-4 Systems provides the tools for such integration of content with the notion of mixed 2D/3D scenes. Real-time presentations of streamed content, like a 2D-segmented video from the vendor, can easily be included in the scene through updates of the scene description and the object descriptors.

FIGURE 1-7 The next generation of portals. (Courtesy of Andreas Graffunder, T-Nova, European IST SoNG project. Used by permission.)

Interactive Broadcast

In this scenario, the MPEG-4 receiver may be a home set-top box, say, part of a high-end home theater, connected to a high-bandwidth broadcast network at its input end. The receiver could also be a conventional multimedia terminal connected to the broadcast network. With the advent of digital broadcasts, the broadcast networks are not limited to the conventional satellite or cable networks that were the only available options until recently; the Internet can now be considered an instance of a broadcast network too. The key concept of MPEG-4 is "create once, access everywhere," and the tools that support it allow content creators and service providers to make their content available across the entire range of available delivery systems.

FIGURE 1-8 Interactive broadcast. (Courtesy of Gianluca De Petris et al., IBC demonstration from the Advanced Interactive Content Initiative. Used by permission.)

Figure 1-8 shows simple features such as interactive enhancement of the broadcast streams. They can easily be extended to support many applications, such as interactive home shopping, enriched documentary programming, advanced electronic services and program guides, interactive advertisements, interactive entertainment like sports programs or quiz shows, viewing of Web-like content, and demographically focused programming.

Multimedia Conferencing and Communities

In this application, as illustrated in Figure 1-9, the terminal may be a multimedia PC (equipped with camera, microphone, and speakers), or it might be part of a high-end videoconferencing system. In fact, it can very well be that these two kinds of terminals are communicating in the same virtual environment. Indeed, the notion of a shared communication space is the main idea of this application, and the use of MPEG-4 Systems to represent the shared data is the technical choice. The user connects to the conference site the same way he connects to a typical Web site, that is, through a specific address or URL. He then receives the MPEG-4 data representing the shared space.

FIGURE 1-9 Multimedia conferencing and communities. (Courtesy of Ananda Ally et al., France Telecom R&D, Oxygen project; John K. Arthur et al., Telenor, Eurescom Venus project; and Peter Schickel, blaxxun Interactive, European IST SoNG project, respectively.)

Other participants in this multimedia conference session may already be connected to each other, and therefore the shared space contains streamed data representing them. Still, the application supports more than simple observation of the shared space: the user may send his own representation into the shared space by using, for example, audiovisual data streams captured from a camera and microphone. The environment can be as simple as a 2D scene, or it can be a shared 3D environment with a couple of users, but it can also be a huge virtual community.

Conclusion

This chapter has provided a brief introduction to MPEG, an overview of the MPEG-4 standard, and a more detailed description of the architecture and tools of the MPEG-4 specification. These descriptions have been done from various


More information

MPEG-7. Multimedia Content Description Standard

MPEG-7. Multimedia Content Description Standard MPEG-7 Multimedia Content Description Standard Abstract The purpose of this presentation is to provide a better understanding of the objectives & components of the MPEG-7, "Multimedia Content Description

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia content description interface Part 5: Multimedia description schemes

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia content description interface Part 5: Multimedia description schemes INTERNATIONAL STANDARD ISO/IEC 15938-5 First edition 2003-05-15 Information technology Multimedia content description interface Part 5: Multimedia description schemes Technologies de l'information Interface

More information

MPEG 기반 AR 표준화현황. 건국대학교컴퓨터공학부윤경로 (yoonk_at_konkuk.ac.kr)

MPEG 기반 AR 표준화현황. 건국대학교컴퓨터공학부윤경로 (yoonk_at_konkuk.ac.kr) MPEG 기반 AR 표준화현황 건국대학교컴퓨터공학부윤경로 (yoonk_at_konkuk.ac.kr) CONTENTS Background of MPEG Status of MPEG-AR activities AR from MPEG s view AR Application Format (23000-13) AR Reference Model (23000-14) Use Cases

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia application format (MPEG-A) Part 4: Musical slide show application format

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia application format (MPEG-A) Part 4: Musical slide show application format INTERNATIONAL STANDARD ISO/IEC 23000-4 Second edition 2009-01-15 Information technology Multimedia application format (MPEG-A) Part 4: Musical slide show application format Technologies de l'information

More information

Skill Area 325: Deliver the Multimedia content through various media. Multimedia and Web Design (MWD)

Skill Area 325: Deliver the Multimedia content through various media. Multimedia and Web Design (MWD) Skill Area 325: Deliver the Multimedia content through various media Multimedia and Web Design (MWD) 325.1 Understanding of multimedia considerations for Internet (13hrs) 325.1.1 Analyze factors affecting

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia application format (MPEG-A) Part 13: Augmented reality application format

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia application format (MPEG-A) Part 13: Augmented reality application format INTERNATIONAL STANDARD This is a preview - click here to buy the full publication ISO/IEC 23000-13 First edition 2014-05-15 Information technology Multimedia application format (MPEG-A) Part 13: Augmented

More information

0 MPEG Systems Technologies- 27/10/2007. MPEG Systems and 3DGC Technologies Olivier Avaro Systems Chairman

0 MPEG Systems Technologies- 27/10/2007. MPEG Systems and 3DGC Technologies Olivier Avaro Systems Chairman 0 MPEG Systems Technologies- 27/10/2007 MPEG Systems and 3DGC Technologies Olivier Avaro Systems Chairman Overview of The Presentation 1 MPEG Systems Technologies- 27/10/2007 Key Standards Developed in

More information

Version 11

Version 11 The Big Challenges Networked and Electronic Media European Technology Platform The birth of a new sector www.nem-initiative.org Version 11 1. NEM IN THE WORLD The main objective of the Networked and Electronic

More information

MPEG's Dynamic Adaptive Streaming over HTTP - An Enabling Standard for Internet TV. Thomas Stockhammer Qualcomm Incorporated

MPEG's Dynamic Adaptive Streaming over HTTP - An Enabling Standard for Internet TV. Thomas Stockhammer Qualcomm Incorporated MPEG's Dynamic Adaptive Streaming over HTTP - An Enabling Standard for Internet TV Thomas Stockhammer Qualcomm Incorporated ABSTRACT Internet video is experiencing a dramatic growth in both fixed and mobile

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Coding of audio-visual objects Part 22: Open Font Format

ISO/IEC INTERNATIONAL STANDARD. Information technology Coding of audio-visual objects Part 22: Open Font Format INTERNATIONAL STANDARD ISO/IEC 14496-22 First edition 2007-03-15 Information technology Coding of audio-visual objects Part 22: Open Font Format Technologies de l'information Codage des objets audiovisuels

More information

The Frozen Mountain irtc White Paper Series

The Frozen Mountain irtc White Paper Series The Frozen Mountain irtc White Paper Series This white paper is the fourth in a series on Internet Based Real Time Communications (irtc) written by Frozen Mountain Software s CTO Anton Venema. The complete

More information

Georgios Tziritas Computer Science Department

Georgios Tziritas Computer Science Department New Video Coding standards MPEG-4, HEVC Georgios Tziritas Computer Science Department http://www.csd.uoc.gr/~tziritas 1 MPEG-4 : introduction Motion Picture Expert Group Publication 1998 (Intern. Standardization

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia framework (MPEG-21) Part 21: Media Contract Ontology

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia framework (MPEG-21) Part 21: Media Contract Ontology INTERNATIONAL STANDARD ISO/IEC 21000-21 First edition 2013-07-01 Information technology Multimedia framework (MPEG-21) Part 21: Media Contract Ontology Technologies de l'information Cadre multimédia (MPEG-21)

More information

THE MPEG-4 STANDARD FOR INTERNET-BASED MULTIMEDIA APPLICATIONS

THE MPEG-4 STANDARD FOR INTERNET-BASED MULTIMEDIA APPLICATIONS Chapter 3 THE MPEG-4 STANDARD FOR INTERNET-BASED MULTIMEDIA APPLICATIONS Charles Law and Borko Furht Abstract With the development of the MPEG-4 standard in 1998, a new way of creating and interacting

More information

Tech Note - 05 Surveillance Systems that Work! Calculating Recorded Volume Disk Space

Tech Note - 05 Surveillance Systems that Work! Calculating Recorded Volume Disk Space Tech Note - 05 Surveillance Systems that Work! Surveillance Systems Calculating required storage drive (disk space) capacity is sometimes be a rather tricky business. This Tech Note is written to inform

More information

The International Intelligent Network (IN)

The International Intelligent Network (IN) The International Intelligent Network (IN) Definition In an intelligent network (IN), the logic for controlling telecommunications services migrates from traditional switching points to computer-based,

More information

INTERNATIONAL TELECOMMUNICATION UNION

INTERNATIONAL TELECOMMUNICATION UNION INTERNATIONAL TELECOMMUNICATION UNION ITU-T H.323 TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU Annex G (02/00) SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS Infrastructure of audiovisual services Systems

More information

Wimba Classroom VPAT

Wimba Classroom VPAT Wimba Classroom VPAT The purpose of this Voluntary Product Accessibility Template, or VPAT, is to assist Federal contracting officials and other buyers in making preliminary assessments regarding the availability

More information

Optical Storage Technology. MPEG Data Compression

Optical Storage Technology. MPEG Data Compression Optical Storage Technology MPEG Data Compression MPEG-1 1 Audio Standard Moving Pictures Expert Group (MPEG) was formed in 1988 to devise compression techniques for audio and video. It first devised the

More information

Advanced Functionality & Commercial Aspects of DRM. Vice Chairman DRM Technical Committee,

Advanced Functionality & Commercial Aspects of DRM. Vice Chairman DRM Technical Committee, Advanced Functionality & Commercial Aspects of DRM Dipl.-Ing. Alexander Zink, MBA Vice Chairman DRM Technical Committee, DRM Treasurer, Vice President DRM Association Services & Structure Up to 4 Services

More information

ITU-T J.288. Encapsulation of type length value (TLV) packet for cable transmission systems

ITU-T J.288. Encapsulation of type length value (TLV) packet for cable transmission systems I n t e r n a t i o n a l T e l e c o m m u n i c a t i o n U n i o n ITU-T J.288 TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU (03/2016) SERIES J: CABLE NETWORKS AND TRANSMISSION OF TELEVISION, SOUND

More information

Tips on DVD Authoring and DVD Duplication M A X E L L P R O F E S S I O N A L M E D I A

Tips on DVD Authoring and DVD Duplication M A X E L L P R O F E S S I O N A L M E D I A Tips on DVD Authoring and DVD Duplication DVD Authoring - Introduction The postproduction business has certainly come a long way in the past decade or so. This includes the duplication/authoring aspect

More information

XF Rendering Server 2008

XF Rendering Server 2008 XF Rendering Server 2008 Using XSL Formatting Objects for Producing and Publishing Business Documents Abstract IT organizations are under increasing pressure to meet the business goals of their companies.

More information

RTT TECHNOLOGY TOPIC October The wireless web

RTT TECHNOLOGY TOPIC October The wireless web RTT TECHNOLOGY TOPIC October 2000 The wireless web In previous HOT TOPICS we have tracked how the traffic mix has changed/is changing from a predominantly voice based medium to a mix of voice and non-voice

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia content description interface Part 1: Systems

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia content description interface Part 1: Systems INTERNATIONAL STANDARD ISO/IEC 15938-1 First edition 2002-07-01 Information technology Multimedia content description interface Part 1: Systems Technologies de l'information Interface de description du

More information

Internet. Class-In charge: S.Sasirekha

Internet. Class-In charge: S.Sasirekha Internet Class-In charge: S.Sasirekha COMPUTER NETWORK A computer network is a collection of two or more computers, which are connected together to share information and resources. Network Operating Systems

More information

A Plexos International Network Operating Technology May 2006

A Plexos International Network Operating Technology May 2006 A Plexos International Network Operating Technology May 2006 BY 4664 Jamestown Ave, Suite 325 Baton Rouge, LA 70808 225.218.8002 1.0 Introduction. is a software environment comprised of proven technologies

More information

Intel Authoring Tools for UPnP* Technologies

Intel Authoring Tools for UPnP* Technologies Intel Authoring Tools for UPnP* Technologies (Version 1.00, 05-07-2003) INFORMATION IN THIS DOCUMENT IS PROVIDED IN CONNECTION WITH INTEL PRODUCTS. NO LICENSE, EXPRESS OR IMPLIED, BY ESTOPPEL OR OTHERWISE,

More information

Introduction to LAN/WAN. Application Layer 4

Introduction to LAN/WAN. Application Layer 4 Introduction to LAN/WAN Application Layer 4 Multimedia Multimedia: Audio + video Human ear: 20Hz 20kHz, Dogs hear higher freqs DAC converts audio waves to digital E.g PCM uses 8-bit samples 8000 times

More information

ISO/IEC Information technology Multimedia framework (MPEG-21) Part 3: Digital Item Identification

ISO/IEC Information technology Multimedia framework (MPEG-21) Part 3: Digital Item Identification INTERNATIONAL STANDARD ISO/IEC 21000-3 First edition 2003-04-01 Information technology Multimedia framework (MPEG-21) Part 3: Digital Item Identification Technologies de l'information Cadre multimédia

More information

Overview of the MPEG-4 Standard

Overview of the MPEG-4 Standard Page 1 of 78 INTERNATIONAL ORGANISATION FOR STANDARDISATION ORGANISATION INTERNATIONALE DE NORMALISATION ISO/IEC JTC1/SC29/WG11 CODING OF MOVING PICTURES AND AUDIO ISO/IEC JTC1/SC29/WG11 N4668 March 2002

More information

Increazing interactivity in IPTV using MPEG-21 descriptors

Increazing interactivity in IPTV using MPEG-21 descriptors Increazing interactivity in IPTV using MPEG-21 descriptors Christos-Nikolaos Anagnostopoulos 1, George Tsekouras 1, Damianos Gavalas 1, Daphne Economou 1 and Ioannis Psoroulas 2 1 University of the Aegean,

More information

Avid Viewpoint: The Promise of AS-02

Avid Viewpoint: The Promise of AS-02 Avid Viewpoint: The Promise of AS-02 9 September, 2011 Corporate Headquarters 800 949 AVID (2843) Asian Headquarters +65 6476 7666 European Headquarters +44 1753 655999 To find your regional Avid office,

More information

ISO/IEC Information technology Multimedia content description interface Part 7: Conformance testing

ISO/IEC Information technology Multimedia content description interface Part 7: Conformance testing This is a preview - click here to buy the full publication INTERNATIONAL STANDARD ISO/IEC 15938-7 First edition 2003-12-01 Information technology Multimedia content description interface Part 7: Conformance

More information

IPTV Explained. Part 1 in a BSF Series.

IPTV Explained. Part 1 in a BSF Series. IPTV Explained Part 1 in a BSF Series www.aucklandsatellitetv.co.nz I N T R O D U C T I O N As a result of broadband service providers moving from offering connectivity to services, the discussion surrounding

More information

White Paper: Delivering Enterprise Web Applications on the Curl Platform

White Paper: Delivering Enterprise Web Applications on the Curl Platform White Paper: Delivering Enterprise Web Applications on the Curl Platform Table of Contents Table of Contents Executive Summary... 1 Introduction... 2 Background... 2 Challenges... 2 The Curl Solution...

More information

Types and Methods of Content Adaptation. Anna-Kaisa Pietiläinen

Types and Methods of Content Adaptation. Anna-Kaisa Pietiläinen Types and Methods of Content Adaptation Anna-Kaisa Pietiläinen Agenda Introduction Multimedia Content Types Types of Adaptation Methods of Adaptation Conclusion Networks 2 Introduction Networks 3 The Problem

More information

Note: This document describes normal operational functionality. It does not include maintenance and troubleshooting procedures.

Note: This document describes normal operational functionality. It does not include maintenance and troubleshooting procedures. Date: 2 September 2013 Voluntary Accessibility Template (VPAT) This Voluntary Product Accessibility Template (VPAT) describes accessibility of Polycom s VVX500 and 600 product families against the criteria

More information

Software. Networked multimedia. Buffering of media streams. Causes of multimedia. Browser based architecture. Programming

Software. Networked multimedia. Buffering of media streams. Causes of multimedia. Browser based architecture. Programming 1 Software Networked multimedia Introduction Browser based software architecture Distributed software Servers Network Terminals User interface Middleware Communications Network multimedia can be defined

More information

Note: This document describes normal operational functionality. It does not include maintenance and troubleshooting procedures.

Note: This document describes normal operational functionality. It does not include maintenance and troubleshooting procedures. Date: 08 Nov 2017 Voluntary Accessibility Template (VPAT) This Voluntary Product Accessibility Template (VPAT) describes accessibility of Polycom s RealPresence Immersive Studio against the criteria described

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology MPEG audio technologies Part 3: Unified speech and audio coding

ISO/IEC INTERNATIONAL STANDARD. Information technology MPEG audio technologies Part 3: Unified speech and audio coding INTERNATIONAL STANDARD This is a preview - click here to buy the full publication ISO/IEC 23003-3 First edition 2012-04-01 Information technology MPEG audio technologies Part 3: Unified speech and audio

More information

ISO/IEC INTERNATIONAL STANDARD

ISO/IEC INTERNATIONAL STANDARD INTERNATIONAL STANDARD ISO/IEC 13818-11 First edition 2004-02-01 Information technology Generic coding of moving pictures and associated audio information Part 11: IPMP on MPEG-2 systems Technologies de

More information

!!!!!! Portfolio Summary!! for more information July, C o n c e r t T e c h n o l o g y

!!!!!! Portfolio Summary!! for more information  July, C o n c e r t T e c h n o l o g y Portfolio Summary July, 2014 for more information www.concerttechnology.com bizdev@concerttechnology.com C o n c e r t T e c h n o l o g y Overview The screenplay project covers emerging trends in social

More information

SERIES J: CABLE NETWORKS AND TRANSMISSION OF TELEVISION, SOUND PROGRAMME AND OTHER MULTIMEDIA SIGNALS Digital transmission of television signals

SERIES J: CABLE NETWORKS AND TRANSMISSION OF TELEVISION, SOUND PROGRAMME AND OTHER MULTIMEDIA SIGNALS Digital transmission of television signals International Telecommunication Union ITU-T J.281 TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU (03/2005) SERIES J: CABLE NETWORKS AND TRANSMISSION OF TELEVISION, SOUND PROGRAMME AND OTHER MULTIMEDIA

More information

Interchange formats. Introduction Application areas Requirements Track and object model Real-time transfer Different interchange formats Comparison

Interchange formats. Introduction Application areas Requirements Track and object model Real-time transfer Different interchange formats Comparison Interchange formats Introduction Application areas Requirements Track and object model Real-time transfer Different interchange formats Comparison Petri Vuorimaa 1 Introduction In transfer of multimedia

More information

New Media Production week 3

New Media Production week 3 New Media Production week 3 Multimedia ponpong@gmail.com What is Multimedia? Multimedia = Multi + Media Multi = Many, Multiple Media = Distribution tool & information presentation text, graphic, voice,

More information

Streaming video. Video on internet. Streaming video, live or on demand (VOD)

Streaming video. Video on internet. Streaming video, live or on demand (VOD) Streaming video 1 Video on internet. There are two different basic ways of presenting video on internet: The receiver downloads the entire video file and than plays it using some kind of media player The

More information

Wireless Signaling and Intelligent Networking

Wireless Signaling and Intelligent Networking 3 Wireless Signaling and Intelligent Networking The first two chapters provided an introduction to the history of mobile communications, its evolution, and the wireless industry standards process. With

More information

Components and Application Frameworks

Components and Application Frameworks CHAPTER 1 Components and Application Frameworks 1.1 INTRODUCTION Welcome, I would like to introduce myself, and discuss the explorations that I would like to take you on in this book. I am a software developer,

More information

AR Standards Update Austin, March 2012

AR Standards Update Austin, March 2012 AR Standards Update Austin, March 2012 Neil Trevett President, The Khronos Group Vice President Mobile Content, NVIDIA Copyright Khronos Group, 2012 - Page 1 Topics Very brief overview of Khronos Update

More information

Overview of the MPEG-4 Version 1 Standard

Overview of the MPEG-4 Version 1 Standard INTERNATIONAL ORGANISATION FOR STANDARDISATION ORGANISATION INTERNATIONALE DE NORMALISATION ISO/IEC JTC1/SC29/WG11 CODING OF MOVING PICTURES AND AUDIO ISO/IEC JTC1/SC29/WG11 N1909 MPEG97 Oct 1997/Fribourg

More information

MISB EG Motion Imagery Standards Board Engineering Guideline. 24 April Delivery of Low Bandwidth Motion Imagery. 1 Scope.

MISB EG Motion Imagery Standards Board Engineering Guideline. 24 April Delivery of Low Bandwidth Motion Imagery. 1 Scope. Motion Imagery Standards Board Engineering Guideline Delivery of Low Bandwidth Motion Imagery MISB EG 0803 24 April 2008 1 Scope This Motion Imagery Standards Board (MISB) Engineering Guideline (EG) provides

More information

Interworking Between SIP and MPEG-4 DMIF For Heterogeneous IP Video Conferencing

Interworking Between SIP and MPEG-4 DMIF For Heterogeneous IP Video Conferencing Interworking Between SIP and DMIF For Heterogeneous IP Video Conferencing Toufik Ahmed 1, Ahmed Mehaoua 1 and Raouf Boutaba 2 1 University of Versailles, CNRS-PRiSM Lab. 45 av. des Etats-Unis, 78000, Versailles,

More information

The Virtual Meeting Room

The Virtual Meeting Room Contact Details of Presenting Authors Stefan Rauthenberg (rauthenberg@hhi.de), Peter Kauff (kauff@hhi.de) Tel: +49-30-31002 266, +49-30-31002 615 Fax: +49-30-3927200 Summation Brief explaination of the

More information

3G Powered 3G-324M Protocol

3G Powered 3G-324M Protocol 3G Powered 3G-324M Protocol NOTICE 2002 RADVISION Ltd. All intellectual property rights in this publication are owned by RADVISION Ltd. and are protected by United States copyright laws, other applicable

More information

5th AR Standards Community Meeting March 19-20, Austin, US Marius Preda Institut TELECOM

5th AR Standards Community Meeting March 19-20, Austin, US Marius Preda Institut TELECOM MPEG Technologies and Roadmap for AR 5th AR Standards Community Meeting March 19-20, Austin, US Marius Preda Institut TELECOM What is MPEG? A family of standards published by ISO/IEC dealing with: Coding/compression

More information

6 Computer Networks 6.1. Foundations of Computer Science Cengage Learning

6 Computer Networks 6.1. Foundations of Computer Science Cengage Learning 6 Computer Networks 6.1 Foundations of Computer Science Cengage Learning Objectives After studying this chapter, the student should be able to: 6.2 Describe network criteria, physical structures and categories

More information

Do not turn this page over until instructed to do so by the Senior Invigilator.

Do not turn this page over until instructed to do so by the Senior Invigilator. CARDIFF CARDIFF UNIVERSITY EXAMINATION PAPER SOLUTIONS Academic Year: 2000-2001 Examination Period: Lent 2001 Examination Paper Number: CMP632 Examination Paper Title: Multimedia Systems Duration: 2 hours

More information

Offering Access to Personalized Interactive Video

Offering Access to Personalized Interactive Video Offering Access to Personalized Interactive Video 1 Offering Access to Personalized Interactive Video Giorgos Andreou, Phivos Mylonas, Manolis Wallace and Stefanos Kollias Image, Video and Multimedia Systems

More information

ITU-T Y Next generation network evolution phase 1 Overview

ITU-T Y Next generation network evolution phase 1 Overview I n t e r n a t i o n a l T e l e c o m m u n i c a t i o n U n i o n ITU-T Y.2340 TELECOMMUNICATION STANDARDIZATION SECTOR OF ITU (09/2016) SERIES Y: GLOBAL INFORMATION INFRASTRUCTURE, INTERNET PROTOCOL

More information

9/8/2016. Characteristics of multimedia Various media types

9/8/2016. Characteristics of multimedia Various media types Chapter 1 Introduction to Multimedia Networking CLO1: Define fundamentals of multimedia networking Upon completion of this chapter students should be able to define: 1- Multimedia 2- Multimedia types and

More information

An Adaptive MPEG-4 Streaming System Based on Object Prioritisation

An Adaptive MPEG-4 Streaming System Based on Object Prioritisation ISSC 2003, Limerick. July 1-2 An Adaptive MPEG-4 Streaming System Based on Object Prioritisation Stefan A. Goor and Liam Murphy Performance Engineering Laboratory, Department of Computer Science, University

More information

MASERGY S MANAGED SD-WAN

MASERGY S MANAGED SD-WAN MASERGY S MANAGED New Performance Options for Hybrid Networks Business Challenges WAN Ecosystem Features and Benefits Use Cases INTRODUCTION Organizations are leveraging technology to transform the way

More information

TRIBHUVAN UNIVERSITY Institute of Engineering Pulchowk Campus Department of Electronics and Computer Engineering

TRIBHUVAN UNIVERSITY Institute of Engineering Pulchowk Campus Department of Electronics and Computer Engineering TRIBHUVAN UNIVERSITY Institute of Engineering Pulchowk Campus Department of Electronics and Computer Engineering A Final project Report ON Minor Project Java Media Player Submitted By Bisharjan Pokharel(061bct512)

More information

CONTENT MODEL FOR MOBILE ADAPTATION OF MULTIMEDIA INFORMATION

CONTENT MODEL FOR MOBILE ADAPTATION OF MULTIMEDIA INFORMATION CONTENT MODEL FOR MOBILE ADAPTATION OF MULTIMEDIA INFORMATION Maija Metso, Antti Koivisto and Jaakko Sauvola MediaTeam, MVMP Unit Infotech Oulu, University of Oulu e-mail: {maija.metso, antti.koivisto,

More information

TECHNOLOGIES USED IN MULTIMEDIA SYSTEMS AND THEIR APPLICATIONS

TECHNOLOGIES USED IN MULTIMEDIA SYSTEMS AND THEIR APPLICATIONS TECHNOLOGIES USED IN MULTIMEDIA SYSTEMS AND THEIR APPLICATIONS Prepared for Mr. John Williams English 214 07 Technical Report Writing by Mohammed Al- Hajjaj 212417 Electrical Engineering Department Abstract

More information

Completing the Multimedia Architecture

Completing the Multimedia Architecture Copyright Khronos Group, 2011 - Page 1 Completing the Multimedia Architecture Erik Noreke Chair of OpenSL ES Working Group Chair of OpenMAX AL Working Group Copyright Khronos Group, 2011 - Page 2 Today

More information

The University of Queensland

The University of Queensland UQ Cyber Security Strategy 2017-2020 NAME: UQ Cyber Security Strategy DATE: 21/07/2017 RELEASE:0.2 Final AUTHOR: OWNER: CLIENT: Marc Blum Chief Information Officer Strategic Information Technology Council

More information

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia service platform technologies Part 3: Conformance and reference software

ISO/IEC INTERNATIONAL STANDARD. Information technology Multimedia service platform technologies Part 3: Conformance and reference software INTERNATIONAL STANDARD ISO/IEC 23006-3 Second edition 2013-09-15 Information technology Multimedia service platform technologies Part 3: Conformance and reference software Technologies de l'information

More information

Workshop W14 - Audio Gets Smart: Semantic Audio Analysis & Metadata Standards

Workshop W14 - Audio Gets Smart: Semantic Audio Analysis & Metadata Standards Workshop W14 - Audio Gets Smart: Semantic Audio Analysis & Metadata Standards Jürgen Herre for Integrated Circuits (FhG-IIS) Erlangen, Germany Jürgen Herre, hrr@iis.fhg.de Page 1 Overview Extracting meaning

More information

ISO/IEC INTERNATIONAL STANDARD

ISO/IEC INTERNATIONAL STANDARD NTERNATONAL STANDARD SO/EC 11172-1 First edition 1993-08-0 1 nformation technology - Coding of moving pictures and associated audio for digital storage media at up to about 1,5 Mbit/s - Part 1: Systems

More information

Migration to Service Oriented Architecture Using Web Services Whitepaper

Migration to Service Oriented Architecture Using Web Services Whitepaper WHITE PAPER Migration to Service Oriented Architecture Using Web Services Whitepaper Copyright 2004-2006, HCL Technologies Limited All Rights Reserved. cross platform GUI for web services Table of Contents

More information

Generalized Document Data Model for Integrating Autonomous Applications

Generalized Document Data Model for Integrating Autonomous Applications 6 th International Conference on Applied Informatics Eger, Hungary, January 27 31, 2004. Generalized Document Data Model for Integrating Autonomous Applications Zsolt Hernáth, Zoltán Vincellér Abstract

More information