Module 10 MULTIMEDIA SYNCHRONIZATION


Lesson 33 Basic definitions and requirements

Instructional objectives

At the end of this lesson, the students should be able to:
1. Define synchronization between media objects.
2. Distinguish between time-independent and time-dependent media objects.
3. Define continuous media objects.
4. Define intra-object synchronization and inter-object synchronization.
5. Define Logical Data Units (LDUs) with examples.

33.0 Introduction

In the previous lessons, we have discussed at length some of the major components of multimedia communication, such as images, video and audio. A multimedia presentation is incomplete unless the individual media streams composing it are presented in a meaningful way. This is addressed by multimedia synchronization, the topic of the current module. The present lesson first outlines some of the fundamental concepts and definitions related to multimedia synchronization. This is followed by a four-layer synchronization reference model, which is needed to understand the various requirements for the run-time mechanisms, to identify the interfaces between them, and to compose the overall system solutions.

33.1 Synchronization between media objects

By now we clearly understand what a media object is, but we also need to understand what relationships should exist between the media objects in a multimedia presentation. The relationships may be temporal, spatial or content-based. The word synchronization refers to time, but in a broader sense the definition of synchronization could be extended to content or spatial relationships. In the strict sense, multimedia synchronization means the temporal relationships between the media objects. A common example of a temporal relationship is a movie or a television broadcast, where both audio and video objects are involved. There it is essential to maintain, during replay, the same temporal relationship that existed during recording.
A typical example of a content relationship is a spreadsheet and a graphic that corresponds to the spreadsheet data. In this case, the same data are represented in two different ways.
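A content relationship of this kind can be sketched in a few lines of code. This is an illustrative example of my own (the function names and data are not from the lesson): one data set drives both a tabular view and a chart view, so any change to the data must reach both views to keep the presentation consistent.

```python
# Hypothetical sketch of a content relationship: the same data set is
# rendered in two different ways, like a spreadsheet and its chart.

def table_view(data):
    """Render the data as rows of 'label: value' text (the spreadsheet view)."""
    return "\n".join(f"{label}: {value}" for label, value in data)

def chart_view(data):
    """Render the same data as a crude horizontal bar chart (the graphic view)."""
    return "\n".join(f"{label} " + "#" * value for label, value in data)

sales = [("Q1", 3), ("Q2", 5)]
print(table_view(sales))
print(chart_view(sales))
```

Because both views are derived from the single `sales` object, they can never disagree about the content; that is the essence of content-based synchronization.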

The spatial relationships refer to the space used for presenting a media object on an output device at a certain point in time. They therefore define where in space (usually the two-dimensional display screen) the media objects should be placed.

33.2 Time-independent and time-dependent media objects

Before we study the mechanisms of temporal synchronization between media objects, we should understand the difference between time-independent and time-dependent media objects. We may note a fundamental difference between how an image or a text is transmitted and how a video or an audio stream is transmitted. For the former type, we may transmit and update the image or text from left to right, right to left, top to bottom (the usual way!) or bottom to top. In the reconstructed presentation, the semantics of the content do not change whichever way we choose, provided the same order is maintained at the transmitter and the receiver. There may be pauses while updating, but still the semantics are unaffected. However, for a video or an audio stream, the order is important. We cannot run a video or an audio presentation in the reverse order, since the semantics would be lost. If a pause is introduced, annoying effects and, for audio, loss of intelligibility would result. From these considerations, we can state the difference between time-dependent and time-independent media objects. A time-dependent media object always has a media stream associated with it. The media stream can be divided into units, and temporal relationships exist between the units of the media stream. A time-independent media object is a traditional media object such as an image, graphics or text; the semantics of the media content do not depend upon a presentation in the time domain. A time-dependent media object is said to be a continuous media object if the presentation durations of the units of the media stream are all equal.
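These definitions can be captured in a small data structure. The following is a minimal sketch under my own naming (the class and field names are not from the lesson): a time-dependent media object carries a stream of units, each with a presentation duration, and it qualifies as continuous exactly when all those durations are equal.

```python
# Illustrative sketch: a time-dependent media object is a stream of units,
# each with a presentation duration; it is "continuous" when all durations
# are equal. Names are hypothetical, not taken from the lesson.

from dataclasses import dataclass

@dataclass
class TimeDependentObject:
    name: str
    unit_durations_ms: list  # presentation duration of each stream unit, in ms

    def is_continuous(self) -> bool:
        # A continuous media object has units of equal presentation duration.
        return len(set(self.unit_durations_ms)) <= 1

video = TimeDependentObject("video", [33.3] * 5)             # frames at ~30 fps
slides = TimeDependentObject("slides", [5000, 12000, 8000])  # varying dwell times

print(video.is_continuous())   # True: every frame has the same duration
print(slides.is_continuous())  # False: slides are shown for unequal times
```

A time-independent object, by contrast, would simply have no `unit_durations_ms` stream at all; its semantics do not involve time.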
For example, video is a continuous media object, since all units (frames) are of equal duration.

33.3 Intra- and inter-object synchronization

Synchronization requirements exist within one time-dependent media object and also between several media objects. Intra-object synchronization refers to the temporal relationship between the various presentation units of a time-dependent media object. For example, if a video presentation has a refresh rate of 30 frames/second, then each frame should be displayed for about 33 ms; otherwise, the synchronization will be lost. Lack of intra-object synchronization results in pauses or jerks.

Inter-object synchronization refers to the temporal relationship between several media objects, which may be time-independent or time-dependent. Fig. 33.1 shows an example of inter-media synchronization. In this example, we first have a video with a synchronized audio track. This is followed by several images, which are in turn followed by an animation synchronized with an audio commentary.

Fig. 33.1. Example of inter-media synchronization (a video with synchronized audio, then images, then an animation with synchronized audio commentary, laid out along a common time axis).

If, in Fig. 33.1, we add a temporal offset to the audio with respect to the video or the animation, a lack of inter-object synchronization results.

33.4 Logical Data Unit (LDU)

We now make a clear distinction between the presentation units and the logical data units of a time-dependent media object. For example, in a video sequence each frame always represents a presentation unit. The logical data unit (LDU), on the other hand, refers to an information unit, which may not be the same as the presentation unit. How we define the information unit depends upon the level of granularity. Each frame consists of macroblocks of size 16 x 16 pixels, and each macroblock may compose an LDU. If we increase the level of granularity further, each pixel may also form an LDU. Each LDU may not contain the same amount of information. For example, if we consider the compressed frames of a video sequence as LDUs, each compressed frame may require a different number of bits to encode. LDUs may also be classified into closed and open LDUs. Closed LDUs are the ones having a predictable duration, like stored continuous media, e.g. audio and video. Open LDUs typically represent input from a live source. Some typical examples of LDU-based synchronization are shown below:

Lip synchronization, which requires a tight coupling between audio and video, as illustrated in Fig. 33.2 (an LDU view of lip synchronization). The synchronization is specified in terms of the maximal skew between the video and the audio streams.

Slide show with audio commentary, which demands that the change of slides be temporally related to the audio commentary.

Synchronized multimedia presentation through interactions: this may be done in a variety of ways. For example, a synchronized audio/video presentation could be followed by an interactive session, where the user may exercise his or her own choice to view a set of slides or another synchronized presentation, followed by further interactions, and so on.
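The two notions of synchronization above can be sketched together in a few lines. This is a hedged illustration: intra-object synchronization fixes each video frame's ideal presentation time (1/30 s apart, as in the 30 frames/second example), while inter-object (lip) synchronization bounds the skew between audio and video timestamps. The 80 ms skew bound used here is a commonly cited lip-sync tolerance, not a value taken from this lesson, and the function names are my own.

```python
# Sketch of intra-object timing and an inter-object (lip-sync) skew check.
# FRAME_PERIOD_MS follows the 30 fps example; MAX_SKEW_MS is an assumed,
# commonly cited lip-sync tolerance, not a value from the lesson.

FRAME_PERIOD_MS = 1000 / 30   # intra-object: ~33 ms per frame at 30 fps
MAX_SKEW_MS = 80              # assumed inter-object (lip-sync) skew bound

def frame_presentation_time(frame_index):
    """Ideal presentation time of a video frame, in milliseconds."""
    return frame_index * FRAME_PERIOD_MS

def in_sync(video_ts_ms, audio_ts_ms):
    """True when the audio/video skew stays within the tolerated bound."""
    return abs(video_ts_ms - audio_ts_ms) <= MAX_SKEW_MS

print(round(frame_presentation_time(3), 1))  # 100.0 ms: 3 frames at ~33 ms
print(in_sync(100.0, 150.0))   # 50 ms skew: within the bound
print(in_sync(100.0, 200.0))   # 100 ms skew: lip synchronization is lost
```

Drifting frame times violate intra-object synchronization (pauses or jerks); a growing audio offset violates inter-object synchronization, which is exactly the temporal-offset failure described for Fig. 33.1.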