The Transmitted Strategy of Proxy Cache Based on Segmented Video


Zhiwen Xu, Xiaoxin Guo, Yunjie Pang, Zhengxuan Wang
Faculty of Computer Science and Technology, Jilin University, Changchun City, 130012, Jilin Province, China
xuzhiwen@public.cc.jl.cn

Abstract. Proxy caching is a key technique for reducing server load, network bandwidth consumption, and startup delay. Based on the popularity of client requests for segmented video, we extend the window over which batching and patching operate by using the proxy's dynamic cache for streaming media. We present transmission schemes that use dynamic caching over a segment-based proxy cache, namely unicast suffix batch, unicast patch, multicast patch, multicast stream merging, and optimal batch patch, and we quantitatively explore how the choice of transmission scheme, cache allocation policy, proxy cache size, and the availability of unicast versus multicast capability affect the resulting transmission cost.

1 Introduction

We consider video streams that travel from a server across the Internet to end clients, and we assume that clients always request playback from the beginning of a video. The proxy receives each client request and, if the prefix of the video is available in its cache, streams the prefix directly to the client. If the video is not present in the proxy, the proxy contacts the server and relays the data it receives to the client. We store segmented videos in the proxy cache according to the popularity of client requests, which raises the byte-hit ratio of the proxy cache.

To improve the efficiency of proxy caching for streaming video, we present a segment-based caching scheme driven by the popularity of client requests. We divide each video into small parts that can be cached and replaced independently. Let b be the minimal cache allocation unit; each video is segmented into multiples of b, and the value of b is chosen according to the startup delays observed in the Internet environment. In the following discussion we simply assume that the video length is a multiple of b, so that each unit can be cached and replaced independently; the prefix cache unit also has length b. According to request popularity, we cache video segments of different sizes so that the segmented cache achieves the maximal byte-hit ratio. We extend the window for batching and patching by using the proxy's dynamic cache, and present transmission schemes that use dynamic caching over a segment-based proxy cache: unicast suffix batch, unicast patch, multicast patch, multicast stream merging, and optimal batch patch.
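The paper does not give pseudocode for the popularity-driven allocation of cache units; the following is a minimal illustrative sketch under the stated assumptions (allocation in whole units of b, proportional to request popularity, capped at each video's length). The function name, the proportional-then-greedy rule, and the example numbers are assumptions for illustration only, not the paper's allocation policy.

```python
def allocate_segment_cache(popularity, video_lengths, cache_size, b):
    """Allocate cached length (in multiples of b) to each video roughly in
    proportion to its popularity f_i, capped at the video length L_i."""
    n = len(popularity)
    # Proportional target, rounded down to whole units of b and capped at L_i.
    cached = [min(int(popularity[i] * cache_size / b) * b, video_lengths[i])
              for i in range(n)]
    leftover = cache_size - sum(cached)
    # Hand out any remaining whole units to the most popular videos
    # that still have uncached data.
    order = sorted(range(n), key=lambda i: popularity[i], reverse=True)
    while leftover >= b:
        for i in order:
            if cached[i] + b <= video_lengths[i]:
                cached[i] += b
                leftover -= b
                break
        else:
            break  # no video can accept another unit
    return cached

# Hypothetical example: three videos, a 2400 s cache, 60 s units.
print(allocate_segment_cache([0.5, 0.3, 0.2], [3600, 3600, 1800], 2400, 60))
```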

2 Proxy Cache Based on Segmented Video

Let f_i measure the relative popularity of video i: every access to the video repository requests video i with probability f_i. Let λ_i be the access rate of video i and λ the aggregate access rate to the repository. The storage vector V = (V_1, V_2, ..., V_N) specifies that a prefix of length V_i seconds of each video i is cached at the proxy, i = 1, 2, ..., N. The storage vector U = (U_1, U_2, ..., U_N) specifies that a segmented cache of length U_i seconds of each video i is cached at the proxy, i = 1, 2, ..., N. The total length of video i cached at the proxy is therefore V_i + U_i. Let c_s and c_p denote the costs of transmitting one byte of video data on the server-proxy path and on the proxy-client path, respectively. c_i(V_i, U_i) is the transmission cost per unit time for video i when the cached length at the proxy is V_i + U_i, and r_i is the stream rate of video i.

2.1 Unicast Suffix Batch Based on Segmented Video

Unicast suffix batch with dynamic caching of segmented video is a simple batching scheme that uses the proxy cache to provide immediate playback. It is designed for unicast transmission from the proxy to the clients. Suppose the first request for video i arrives at time 0. The proxy immediately transmits the cached prefix to the client. Batching delays the transmission from the server to the proxy as long as possible while keeping playback at the client continuous; that is, the first frame of the suffix is scheduled to reach the client at time V_i. The prefix length depends on the network environment, whereas the additional cached length U_i depends on how frequently clients request this video; in other words, U_i is determined by the popularity of the video. For any request arriving in the interval (0, V_i + U_i), the proxy simply forwards the incoming suffix (of length L_i - V_i - U_i) to the client, so no new suffix transmission is needed from the server. In effect, several requests for the suffix are served in a single batch using the dynamic cache. Segment-based unicast transmission with a prefix cache already removes the startup delay, but batching with the dynamic cache further increases transmission efficiency, because its batching unit is realized by the dynamic cache window and thus differs from the other schemes. Assuming Poisson arrivals, the average number of requests in [0, V_i + U_i] is 1 + (V_i + U_i)λ_i, and these requests trigger a single transmission of the suffix [V_i + U_i, L_i] from the server. The average transmission cost of video i is

c_i(V_i, U_i) = \left( \frac{L_i - V_i - U_i}{1 + (V_i + U_i)\lambda_i}\, c_s + c_p L_i \right) \lambda_i r_i   (1)
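As a quick check of Eq. (1), the sketch below evaluates the unicast suffix batch cost per unit time from the quantities defined above; the function name and the example parameter values are illustrative assumptions.

```python
def unicast_suffix_batch_cost(L, V, U, lam, r, c_s, c_p):
    """Average transmission cost per unit time of a video under unicast
    suffix batching, following Eq. (1).
    L: video length (s), V: prefix cache (s), U: segment cache (s),
    lam: request rate of the video (1/s), r: stream rate (bytes/s),
    c_s, c_p: per-byte costs on the server-proxy and proxy-client paths."""
    suffix = L - V - U                 # portion fetched once from the server
    batch_size = 1 + (V + U) * lam     # mean number of requests per batch
    return (suffix * c_s / batch_size + c_p * L) * lam * r

# Hypothetical example: a 2-hour video with 5 minutes cached at the proxy.
print(unicast_suffix_batch_cost(L=7200, V=60, U=240, lam=0.01,
                                r=500_000, c_s=1.0, c_p=0.5))
```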

2.2 Unicast Patch Based on Segmented Video

Unicast patching over a segment-based proxy cache with dynamic caching can save network resources. Suppose the first request for video i arrives at time 0, so the portion of the video not stored in the proxy cache arrives from the server starting at time V_i + U_i (Fig. 1). Now suppose another request arrives at time t_2, with V_i + U_i < t_2 < L_i. One option is to fetch the uncached part of the video from the server directly; the other is to use patching. We retrieve [V_i + U_i, t_2] as a patch from the server; since the segment [t_2, L_i] is already scheduled for transmission, the patch is delivered at time t_2 + V_i + U_i, and the client receives from two channels simultaneously, combining the suffix and the patch. A patch is therefore governed by a suffix threshold G_i, measured from the beginning of the suffix: if a request arrives within G_i of the most recent suffix transmission, the proxy schedules a patch from the server for it; otherwise, it starts a new, complete transmission of the suffix. Assuming a Poisson arrival process, the average number of requests between the initiations of two consecutive suffix transmissions is 1 + λ_i(V_i + U_i + G_i). These requests share a single transmission of the suffix [V_i + U_i, L_i] from the server, and the total length of the patches sent by the server for them is

\int_0^{G_i} \lambda_i x \, dx = \frac{\lambda_i G_i^2}{2}   (2)

because, for a Poisson process, the arrivals within the interval [V_i + U_i, V_i + U_i + G_i] are uniformly distributed. The average transmission cost of video i is

c_i(V_i, U_i) = c_s \lambda_i r_i \, \frac{\lambda_i G_i^2 / 2 + L_i - V_i - U_i}{1 + \lambda_i (V_i + U_i + G_i)} + c_p \lambda_i r_i L_i   (3)

Fig. 1: Unicast patch
Fig. 2: Case 1
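A small sketch of Eq. (3) follows, together with a brute-force search for the suffix threshold G_i that minimizes the cost. The grid search and the example parameters are illustrative assumptions; the paper does not prescribe how G_i is chosen.

```python
def unicast_patch_cost(L, V, U, G, lam, r, c_s, c_p):
    """Average transmission cost per unit time under unicast patching,
    following Eq. (3). G is the suffix threshold G_i."""
    server_bytes = lam * G**2 / 2 + (L - V - U)   # patches + one suffix
    period = 1 + lam * (V + U + G)                # mean requests per suffix cycle
    return c_s * lam * r * server_bytes / period + c_p * lam * r * L

def best_threshold(L, V, U, lam, r, c_s, c_p, step=10):
    """Grid search for G over [0, L - V - U] (illustrative, not from the paper)."""
    candidates = range(0, int(L - V - U) + 1, step)
    return min(candidates,
               key=lambda g: unicast_patch_cost(L, V, U, g, lam, r, c_s, c_p))

G_opt = best_threshold(L=7200, V=60, U=240, lam=0.01, r=500_000, c_s=1.0, c_p=0.5)
print(G_opt, unicast_patch_cost(7200, 60, 240, G_opt, 0.01, 500_000, 1.0, 0.5))
```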

2.3 Multicast Patch Based on Segmented Video

If the path from the proxy to the clients supports multicast, the proxy can use a segment-based multicast patching scheme with dynamic caching. Suppose a request for video i arrives at time 0 (Fig. 2): the proxy begins multicasting the cached portion [0, V_i + U_i] at time 0, the server begins transmitting the suffix at time V_i + U_i, and the proxy forwards the received data to the clients as a multicast patch using the dynamic cache. Let T_i be the threshold that controls how frequently a complete stream is transmitted. Suppose a further request arrives at t_2 shortly after the stream starts (0 < t_2 ≤ T_i); the transmission to the clients falls into two cases according to the relationship between V_i + U_i and T_i.

Case 1: T_i ≤ V_i + U_i ≤ L_i (Fig. 2). The client receives the segment [0, t_2] by unicast from the proxy over a single channel, and receives [t_2, L_i] from the ongoing multicast. Assuming Poisson arrivals, the transmission cost function g_1(V_i, U_i, T_i) is

g_1(V_i, U_i, T_i) = \frac{\lambda_i r_i}{1 + \lambda_i T_i}\left[(L_i - V_i - U_i)c_s + L_i c_p + \frac{\lambda_i T_i^2}{2}\, c_p\right]   (4)

Case 2: 0 ≤ V_i + U_i < T_i (Fig. 3). If 0 < t_2 ≤ V_i + U_i, the transmission is organized as in Case 1. If V_i + U_i < t_2 ≤ T_i, the client receives the segment [0, V_i + U_i] by unicast from the proxy over a single channel and receives [t_2, L_i] from the ongoing multicast, while the server transmits [V_i + U_i, t_2] through the proxy to the client by unicast. Assuming Poisson arrivals, the transmission cost function g_2(V_i, U_i, T_i) is

g_2(V_i, U_i, T_i) = \frac{\lambda_i r_i}{1 + \lambda_i T_i}\left[(L_i - V_i - U_i)c_s + L_i c_p + \frac{\lambda_i (V_i + U_i)^2}{2}\, c_p + \frac{\lambda_i (T_i - V_i - U_i)^2}{2}(c_s + c_p)\right]   (5)

For k = 1, 2, let h_k(V_i, U_i) denote the minimum of the corresponding cost function over the threshold: h_k(V_i, U_i) = min{ g_k(V_i, U_i, T_i) : 0 ≤ T_i ≤ L_i }. For a given cached length V_i + U_i, the average transmission cost is c_i(V_i, U_i) = min{ h_1(V_i, U_i), h_2(V_i, U_i) }.

Fig. 3: Case 2
Fig. 4: Optimal multicast patch
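The following sketch evaluates Eqs. (4) and (5) and carries out the minimization over T_i numerically, as in the definition of h_k above; the grid search over T_i and the example parameter values are illustrative assumptions.

```python
def g1(L, V, U, T, lam, r, c_s, c_p):
    """Eq. (4): cost function for Case 1 (T_i <= V_i + U_i <= L_i)."""
    return lam * r / (1 + lam * T) * (
        (L - V - U) * c_s + L * c_p + lam * T**2 / 2 * c_p)

def g2(L, V, U, T, lam, r, c_s, c_p):
    """Eq. (5): cost function for Case 2 (0 <= V_i + U_i < T_i)."""
    return lam * r / (1 + lam * T) * (
        (L - V - U) * c_s + L * c_p
        + lam * (V + U)**2 / 2 * c_p
        + lam * (T - V - U)**2 / 2 * (c_s + c_p))

def multicast_patch_cost(L, V, U, lam, r, c_s, c_p, step=10):
    """c_i(V_i, U_i) = min{h_1, h_2}, with each h_k obtained by minimizing
    g_k over 0 <= T_i <= L_i via a simple grid search (illustrative)."""
    ts = range(0, int(L) + 1, step)
    h1 = min(g1(L, V, U, t, lam, r, c_s, c_p) for t in ts)
    h2 = min(g2(L, V, U, t, lam, r, c_s, c_p) for t in ts)
    return min(h1, h2)

print(multicast_patch_cost(L=7200, V=60, U=240, lam=0.01,
                           r=500_000, c_s=1.0, c_p=0.5))
```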

2.4 Multicast Stream Merging Based on Segmented Video

The key issue in stream merging is deciding how to merge a later stream into an earlier one. The Closest Target policy [5] is an online heuristic whose performance is close to that of optimal offline stream merging. Our multicast merge scheme combines dynamic caching with stream merging and uses the Closest Target policy to decide how later streams are merged into earlier ones. For a video segment requested by a client, any portion cached at the proxy is transmitted directly from the proxy to the client, while the suffix not stored in the proxy cache is transmitted from the server as late as possible, subject to continuous playback at the client. Let P_j be the probability that a request requires j seconds of the video, 0 ≤ j ≤ L_i; this distribution can be obtained by monitoring a running system. The average transmission cost of video i is

c_i(V_i, U_i) = \left[ \sum_{j=b}^{V_i + U_i} j P_j c_p + \sum_{j=V_i+U_i+b}^{L_i} \big( j(c_p + c_s) - (V_i + U_i) c_s \big) P_j \right] r_i   (6)

2.5 Batch Patch Based on Segmented Dynamic Cache

Recently, White and Crowcroft [8] introduced optimal batch patching for prefix caching in order to minimize the average stream rate of the server. We apply optimal batch patching with dynamic caching to segmented video. Client requests are batched over a fixed interval, denoted V_i + U_i, before the server is asked for the regular multicast channel (RM), and an optimal patching window W is used: for requests arriving beyond this window, initiating a new regular channel RM is more efficient than sending a patch within V_i + U_i (Fig. 4). A batch that starts an RM transmission is a startup interval, while the remaining batches are non-startup intervals, and the average amount of data sent is taken over the interval between two adjacent RM transmissions. The average transmission rate over an RM period is

R = \frac{(1-P)\, r W^2 + (1-P)(V_i + U_i)\, r W + (V_i + U_i)\, r L}{2 (V_i + U_i) W + \dfrac{2 (V_i + U_i)^2}{1-P}}   (7)

where P = P(0) is the probability that no request arrives in a batching interval V_i + U_i, L is the duration of the video, and r is the stream rate of the video. The optimal patching window is obtained by differentiating R with respect to W and setting the derivative to zero, which gives

W = \frac{-(V_i + U_i) + \sqrt{P (V_i + U_i)^2 + 2 (1-P)(V_i + U_i) L}}{(V_i + U_i)(1-P)}   (8)

Optimal batch patching uses the dynamic cache to serve the batched and patched requests within the interval V_i + U_i. The unit b of the prefix cache cannot be too large, and is generally a few seconds; if it is too large, it will lengthen the clients' waiting time.
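A sketch that evaluates Eqs. (7) and (8) as printed above: it derives P from the Poisson request rate, computes the optimal window W, and substitutes it back into R. Using P = exp(-λ_i(V_i + U_i)) for the empty-batch probability follows from the Poisson arrival assumption rather than being stated explicitly in the paper, and the example numbers are illustrative.

```python
import math

def optimal_batch_patch_window(L, V, U, lam):
    """Optimal patching window W of Eq. (8).
    P = exp(-lam*(V+U)) is the Poisson probability of an empty batch
    (an assumption consistent with the paper's Poisson arrivals)."""
    beta = V + U                       # batching interval V_i + U_i
    P = math.exp(-lam * beta)          # probability of zero requests in a batch
    W = (-beta + math.sqrt(P * beta**2 + 2 * (1 - P) * beta * L)) / (beta * (1 - P))
    return P, W

def mean_rm_rate(L, V, U, lam, r):
    """Average transmission rate R of Eq. (7), evaluated at the optimal W."""
    beta = V + U
    P, W = optimal_batch_patch_window(L, V, U, lam)
    num = (1 - P) * r * W**2 + (1 - P) * beta * r * W + beta * r * L
    den = 2 * beta * W + 2 * beta**2 / (1 - P)
    return num / den

print(mean_rm_rate(L=7200, V=60, U=240, lam=0.01, r=500_000))
```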

In a segment-based proxy cache, the dynamic caching technique lets the total length of the prefix and segment cache act as the unit of batching and patching. If t_2 < V_i + U_i < W, there are multiple patch streams batched over V_i + U_i, and the dynamic cache in the proxy completes the patch processing within the duration of those patch streams, so the traffic from the server to the proxy is reduced by a factor of 1/(1 + (V_i + U_i)λ_i). If t_2 < W < V_i + U_i, there are multiple patch streams batched over W; using a dynamic cache of length W in the proxy, the patch processing completes within the duration of those streams, and the traffic from the server to the proxy is reduced by a factor of 1/(1 + W λ_i). Both cases reduce the output of the server and improve the efficiency of the proxy cache.

3 Conclusions

We have presented a segment-based proxy caching technique driven by video popularity, and extended the batching window by using a dynamic cache. Furthermore, we have proposed several segment-based schemes, namely unicast suffix batch, unicast patch, multicast patch, multicast stream merging, and optimal batch patch, and studied how the choice of transmission scheme affects the transmission cost. The evaluation shows that even with a relatively small proxy cache, a carefully designed transmission scheme with dynamic caching of segmented video can yield significant cost savings, and that its performance is superior to the optimal prefix cache, proportional priority cache, and 0-1 cache schemes.

References

1. C. Aggarwal, J. Wolf, and P. Yu: On optimal batching policies for video-on-demand storage servers. In: Proc. of IEEE International Conference on Multimedia Computing and Systems, June 1996.
2. K. L. Wu, P. S. Yu: Segment-Based Proxy Caching of Multimedia Streams. In: Proc. of IEEE INFOCOM, May 2001.
3. S. Sen, J. Rexford, D. Towsley: Proxy Prefix Caching for Multimedia Streams. In: Proc. of IEEE INFOCOM, Mar. 1999.
4. K. Hua, Y. Cai and S. Sheu: Patching: A multicast technique for true video-on-demand services. In: Proc. ACM Multimedia, September 1998.
5. D. Eager, M. Vernon and J. Zahorjan: Optimal and efficient merging schedules for video-on-demand servers. In: Proc. ACM Multimedia, November 1999.
6. B. Wang, S. Sen, M. Adler and D. Towsley: Proxy-based distribution of streaming video over unicast/multicast connections. Tech. Rep. 01-05, Department of Computer Science, University of Massachusetts, Amherst, 2001.
7. O. Verscheure, C. Venkatramani, P. Frossard, L. Amini: Joint server scheduling and proxy caching for video delivery. IBM Technical Report RC21981, 2001.
8. Paul P. White, Jon Crowcroft: Optimised Batch Patching with Classes of Service. ACM Computer Communication Review (2000), Vol. 30.