MIUN HLS Player - Proof-of-concept application for HTTP Live Streaming in Android 2.3 (October 2011) Jonas Bäckström Email: joba0702@student.miun.se Johan Deckmar Email: jode0701@student.miun.se Alexandre Perez-Boutavin Email: alpe1008@student.miun.se Abstract Currently, there exist few - if any - good media players for Google's mobile operating system Android capable of handling Apple's new HTTP Live Streaming (HLS) format. In this paper we show how this can be implemented on Android 2.3 (and higher) and what difficulties were met throughout this research project. We also discuss the topics of buffering and stream quality management. Index Terms Android 2.3, HTTP Live Streaming, HLS Player, MPEG Transport Stream I. INTRODUCTION In this article we present background and an implementation of HTTP Live Streaming (HLS), a video streaming protocol recently proposed by Apple Inc. Our implementation targets Android 2.3, whereas Android 3.0 and above natively support HLS streaming. The research project tried to fulfill the following demands and requirements. A. Functional requirements Allow the user to switch to a desired quality depending on the available bandwidth. Provide a seamless transition from one quality to another. The user should be able to control the video player (mute, traverse, pause/resume the clip). B. Non-functional requirements Implemented on Android 2.3. The media player should support HTTP Live Streaming (HLS). Permission to make digital/hard copy of part or all of this work for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage, the copyright notice, the title of the publication, and its date appear, and notice is given that copying is by permission of the ACM, Inc. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. © 2011 II. 
RELATED WORK There already exist a few SDKs/players for the Android operating system which handle HTTP Live Streaming (HLS), though they mainly target Android version 3.0 (Honeycomb). A. Nextreaming SDK & NexPlayer In 2010 the mobile multimedia solution provider Nextreaming released an SDK for Android supporting the new HTTP Live Streaming (HLS) format. Their product line also includes a video player called NexPlayer(TM). The video player implements the IETF Internet-Draft "HTTP LIVE STREAMING draft-pantos-http-live-streaming-04" protocol and uses H.264 BP as video decoder and the following audio decoders: AAC-LC, HE AAC, HE AAC v2, MP3. In addition it also has built-in support for 128-bit AES encryption and adaptive bit rate. [1][2] B. RealNetworks SDK & RealPlayer At the IBC conference 2010, RealNetworks (one of the leading digital entertainment services companies) announced that - as the first Android mobile player in the world - their main product RealPlayer now supports HTTP Live Streaming (HLS). RealNetworks also released an Android SDK by the name Helix. All standards-based codecs (including MPEG4, H.264, H.263) as well as the standards-based protocols (RTSP, RTP, SDP, HTTP, HTTP Live) are supported. Furthermore it includes support for AES and Verimatrix DRM encryption. [3][4] III. ABOUT HLS A. HLS Background HTTP Live Streaming, or HLS, is an HTTP-based media streaming protocol developed by Apple Inc. and released as a draft on November 1, 2009. Although it is natively implemented in all Apple products, HLS is also being integrated into other platforms' media players (VLC recently released a beta version supporting HLS, and Android 3.0 now handles it natively). [5][6][7][8][9]
Furthermore, HLS provides content protection in the media stream segmenter, which can individually encrypt each media file using the AES-128 encryption method with a 16-octet key. [12] Fig. 1. Basic HLS configuration. B. HTTP Live Streaming Architecture To view a video, a client needs to download, through HTTP, the related .ts files held in a global container (a .m3u/.m3u8 playlist). Those files are commonly stored on a web server at the client's disposal (see Distribution-side in the figure above). On the Server-side, the original inputs are encoded with H.264 for the video and AAC for the audio; they are then packaged into an MPEG Transport Stream. From there, the segment encoder divides it into smaller chunks of equal duration (10 seconds is recommended) to finally generate the MPEG transport stream (.ts) files and an index file (the .m3u8 playlist) with references to each media file. [10][11] Here is a typical example of a .m3u8 playlist:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:1
http://my.media.server.com/filesequence1.ts
http://my.media.server.com/filesequence2.ts
http://my.media.server.com/filesequence3.ts
#EXT-X-ENDLIST
In addition, HLS allows client software to adapt the bit rate of the video according to the client's bandwidth. Different bit rates can be assigned to alternative streaming files. Therefore, for the same content, a playlist can be created with links to those alternative streams. [10] An example of what such a playlist can look like:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
http://my.media.server.com/low_quality/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
http://my.media.server.com/medium_quality/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=800000
http://my.media.server.com/high_quality/prog_index.m3u8
IV. QUALITY-SPECIFIC BUFFERING AND PLAYBACK The first step in the buffering process is fetching the .m3u8 playlist file, which we proceed to parse into a useful structure. 
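A variant playlist like the one above can be parsed into a bandwidth-to-URL map with a few lines of code. The following is a minimal sketch in Java; the class and method names are our own illustrations, not taken from the actual implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class VariantPlaylistParser {

    /** Parses a variant .m3u8 playlist into a map of bandwidth (bit/s) -> stream URL. */
    public static Map<Integer, String> parse(String playlist) {
        Map<Integer, String> streams = new LinkedHashMap<>();
        int pendingBandwidth = -1;
        for (String line : playlist.split("\n")) {
            line = line.trim();
            if (line.startsWith("#EXT-X-STREAM-INF:")) {
                // Extract the BANDWIDTH attribute, e.g. "PROGRAM-ID=1,BANDWIDTH=500000"
                for (String attr : line.substring("#EXT-X-STREAM-INF:".length()).split(",")) {
                    attr = attr.trim();
                    if (attr.startsWith("BANDWIDTH=")) {
                        pendingBandwidth = Integer.parseInt(attr.substring("BANDWIDTH=".length()));
                    }
                }
            } else if (!line.isEmpty() && !line.startsWith("#") && pendingBandwidth != -1) {
                // A non-comment line following EXT-X-STREAM-INF is the URL of that stream
                streams.put(pendingBandwidth, line);
                pendingBandwidth = -1;
            }
        }
        return streams;
    }
}
```

The resulting map preserves the playlist order and gives direct lookup from a chosen bandwidth to the corresponding stream URL.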
Each URL is stored together with the bandwidth and is later used for switching between the different stream qualities. When the user decides to change quality via the menu, the URL matching that bandwidth is set as the new source from which the buffer should continue reading. E.g. the buffer is targeted at example_stream_1, has recently sent package #10 to the presentation layer (media player) and is currently holding packages #11-14. We then proceed to set a new target (example_stream_2), flush the buffer and start reading from package #11 in the new quality. If the connectivity of the mobile device is sufficiently good, the user experiences a seamless switch in quality without the video pausing. A. Buffering architecture The application uses an always-buffer-N-ahead strategy for buffering the TS stream, which - compared to a standard buffer-and-read strategy - always aims at keeping N packages in the buffer. Presuming that the mobile device running the application has good enough connectivity, this strategy ensures that the presentation layer (media player) receives an even flow of data without interruptions. Fig. 2. Illustration of a buffer taking one TS package at a time, feeding it to the presentation layer (media player) and then - once the presentation layer has consumed the file - starting to buffer the next. Fig. 3. Illustration of the always-buffer-N-ahead strategy, where we always try to keep N TS packages in the buffer. At the same time as we feed one package to the presentation layer (media player), we start buffering the next package in line. B. Buffer-related problems For the buffer-N-parts-ahead strategy, N=3 has been chosen in our implementation, resulting in a 30 second buffer. Each
time a buffered file has completely sent its data through the local connection to the VideoView, the next video file will in turn start downloading the data that will be displayed 20 seconds in the future. The first assumption was that a change in the video quality would take between 20 and 30 seconds to propagate to the video player, or up to 10 seconds if the buffer was discarded. Instead a problem arose: the video player is unable to switch to the new bit rate video within the same stream. To solve this problem, and also the issue of a large uncontrollable cache in the video player itself, the video player is again sent a local URL to stream from, and the local proxy switches over to buffering only the newly set quality. This introduces a couple of seconds of black pause in playback and a jump in time when switching quality, where the pause is mostly caused by buffering in the video player. V. ARCHITECTURAL OVERVIEW The architectural components of the application consist of a graphical interface and a separate stream proxy which manages parsing of the HLS structure and video caching. These parts send event notifications to each other and, when ready, stream the cached video over a local TCP/IP connection. VI. RESULTING APPLICATION The application works in both portrait and landscape mode and scales well with different screen sizes and resolutions. At start-up the user is presented with a non-interactive screen showing a different version of the application logo depending on the orientation. If the user presses the menu button, he is shown three different choices (change stream quality, open URL and exit application). Pressing the quality button, the user is presented with the different stream qualities available. If a choice different from the current one is made, the application will switch to the new quality. Choosing the URL option opens an input field where the user can enter a link to a new m3u8 file. 
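Internally, a quality switch like the one described in Section IV amounts to retargeting the buffer at the URL matching the chosen bandwidth, flushing the segments cached in the old quality, and continuing from the first segment not yet sent to the player. A minimal sketch in Java; all names are illustrative and not taken from the actual implementation:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Sketch of the buffer retargeting performed on a quality switch. */
public class StreamBuffer {
    private String targetUrl;       // base URL of the currently buffered quality
    private int nextSequence = 1;   // sequence number of the next .ts segment to fetch
    private final Deque<byte[]> cached = new ArrayDeque<>();

    public StreamBuffer(String initialUrl) {
        this.targetUrl = initialUrl;
    }

    /** Retarget the buffer at a new quality: drop segments cached in the old
     *  quality and continue from the first segment not yet sent to the player. */
    public synchronized void switchQuality(String newUrl, int firstUnsentSequence) {
        targetUrl = newUrl;
        nextSequence = firstUnsentSequence; // e.g. #11 if #10 was the last one played
        cached.clear();                     // flush segments buffered in the old quality
    }

    /** URL of the next segment to download, in the currently targeted quality. */
    public synchronized String nextSegmentUrl() {
        return targetUrl + "/filesequence" + (nextSequence++) + ".ts";
    }
}
```

With sufficiently good connectivity, the player never runs dry during this retargeting, which is what makes the switch appear seamless.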
Once the video is playing the user can interact with the player by clicking the screen. This will in turn open a floating menu with options for pausing/resuming, traversing and muting the sound in the clip. The application can be terminated either by clicking the Exit option in the menu or by simply pressing the return button on the mobile device. Fig. 5. Initial view in portrait orientation. Fig. 4. Example flow of events when opening an HLS stream. Internal to the stream proxy are objects that download, store, stream and later on delete the video data of each file in the HLS stream structure. These buffering objects each do their work on their own thread and can notify the proxy manager when they have successfully cached their video. Such a notification from the very first video file is used to trigger the VideoView in the GUI to start streaming locally. The VideoView operates on a URL pointing to a server running locally on the device, managed by the stream proxy. When the VideoView connects to the proxy, the proxy first responds with an HTTP header and then proceeds to pipe the video data from the buffering objects in sequential order. In case the Internet connection does not allow us to cache video data at least as fast as the playback rate, the local proxy will stall for a while. Because of the relatively large internal buffer of the VideoView, this has yet to cause the video playback to halt. Fig. 6. View when opening a playlist.
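The interplay between the buffering threads and the local proxy can be sketched with a bounded queue: downloader threads block once N segments are cached ahead of playback (the always-buffer-N-ahead strategy of Section IV), while the proxy writes an HTTP header and then pipes the segments to the player in order. The sketch below is a simplification under our own naming; the real implementation differs:

```java
import java.io.OutputStream;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

/** Sketch of the always-buffer-N-ahead strategy combined with the local proxy. */
public class LocalProxy {
    // Bounded queue of cached TS segments: put() blocks once N segments are
    // buffered ahead of playback, which realizes the N-ahead strategy.
    private final BlockingQueue<byte[]> cached;

    public LocalProxy(int n) {
        cached = new ArrayBlockingQueue<>(n);
    }

    /** Called by a downloader thread once a segment is fully cached. */
    public void offerSegment(byte[] segment) throws InterruptedException {
        cached.put(segment);
    }

    /** Respond to the VideoView: first an HTTP header, then the segments in order. */
    public void serve(OutputStream toPlayer, int segmentCount) throws Exception {
        toPlayer.write("HTTP/1.1 200 OK\r\nContent-Type: video/MP2T\r\n\r\n"
                .getBytes("US-ASCII"));
        for (int i = 0; i < segmentCount; i++) {
            toPlayer.write(cached.take()); // blocks until the next segment is ready
        }
        toPlayer.flush();
    }
}
```

If downloads fall behind the playback rate, `cached.take()` simply blocks, which corresponds to the proxy stalling as described above; the VideoView's own internal buffer then absorbs the gap.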
Fig. 7. View when streaming a video in portrait orientation. Fig. 10. View when choosing the quality. Fig. 8. Initial view in landscape orientation. Fig. 9. View when watching a video in landscape orientation. VII. FUTURE WORK The next step for this implementation would be to automatically and seamlessly switch to a video bandwidth supported by the current connection. Switching automatically would require measuring the mean bandwidth of the video files being downloaded and switching to a stream accordingly. Switching seamlessly requires complete control of the video player buffer. Once we know which bandwidth can be streamed fast enough, there is no problem in starting to buffer that stream - but in order not to disturb the user, the video player must be able to instantly play that new bandwidth when the switch is made. It is unacceptable that the player shows a black screen while internally buffering many tens of seconds of video when this work has already been done for it. To achieve such control over the buffering in the player, its code would need to be forked and modified into a player with a minimalistic buffer strategy. As with most standard components in Android, the video player is built partly in Java and partly in C. VIII. CONCLUSIONS At the end of the implementation, all the main goals were met. The final prototype is a media player application implemented on Android 2.3 capable of playing HLS. The user can open a URL linking to an HLS playlist and switch seamlessly between the different available qualities of the media content thanks to our buffer management. We conclude that there are no obstacles stopping us from constructing a fully functional HLS player for Android 2.3, most importantly since the built-in player natively supports the codecs dictated by the proposed HLS standard - H.264 and AAC - and the MPEG-2 Transport Stream (TS) container. This raises the question of why Google Inc. decided to natively support HLS only in Android 3.0 and above. 
The authors suggest that the reason behind this is that the new graphical framework classes, such as Fragment and ActionBar, are used in the components surrounding the new HLS player from Google, making it backwards incompatible with Android 2.3 and below. ACKNOWLEDGMENT The authors would like to thank Dr. Ulf Jennehag, Mid Sweden University, for suggesting this topic as a research project and for guidance along the way. REFERENCES [1] Nextreaming, Nextreaming releases HTTP Live Streaming (HLS) Player SDK for Android, Published: Sept 16, 2010, IBC conf. Amsterdam, http://www.prlog.org/10935063-nextreaming-releases-http-livestreaming-hls-player-sdk-for-android.html [2] Nextreaming, NexPlayer(TM), http://www.nextreaming.com/product/nex player.php [3] RealNetworks, RealNetworks Gives Handset and Tablet OEMs Ability to Deliver HTTP Live Content to Android Users, Published: 2010, IBC conf. Amsterdam, http://www.realnetworks.com/pressroom/releases/2010/realplayerfor-mobile-delivers-http-live-content-to-android.aspx
[4] RealNetworks, Helix DNA - The Building Blocks for Digital Media, Published: 2010, http://www.realnetworks.com/uploadedfiles/helix/resourcelibrary/datasheet-helix-dna.pdf [5] Apple, HTTP Live Streaming (HLS) - Documentation and best practises, http://developer.apple.com/resources/http-streaming/ [6] Apple Inc., HTTP Live Streaming Overview - Introduction, http://developer.apple.com/library/ios/#documentation/networkinginternet /conceptual/streamingmediaguide/introduction/introduction.html [7] IETF, HLS Draft 00, http://tools.ietf.org/html/draft-pantos-http-livestreaming-00 [8] Anevia, BETA VLC Media Player for HLS, http://demo.anevia.com:8080/ott/vlc.php [9] Android Developers, Android 3.0 Platform Highlights, http://developer.android.com/sdk/android-3.0-highlights.html [10] Apple Inc., HTTP Live Streaming Overview - HTTP Streaming Architecture, http://developer.apple.com/library/ios/#documentation/networkinginternet/ conceptual/streamingmediaguide/httpstreamingarchitecture/httpstrea mingarchitecture.html#//apple_ref/doc/uid/tp40008332-ch101-sw2 [11] Nullsoft Inc., The M3U Playlist format, originally invented for the Winamp media player, http://wikipedia.org/wiki/m3u [12] Apple Inc., HTTP Live Streaming Overview - Using HTTP Live Streaming - Content Protection, http://developer.apple.com/library/ios/#documentation/networkinginternet/ conceptual/streamingmediaguide/usinghttplivestreaming/usinghttp LiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-DontLink ElementID_22 [13] Apple, HTTP Live Streaming Draft, R. Pantos Ed., Version 07, Published: Sept 30, 2011, http://tools.ietf.org/html/draft-pantos-http-livestreaming-07 [14] Google, Android source code - android.widget.videoview, http://grepcode.com/file/repository.grepcode.com/java/ext/com.google. android/android/2.1_r2/android/widget/videoview.java [15] Google, Android source code - android.widget.mediaplayer, http://grepcode.com/file/repository.grepcode.com/java/ext/com.google. 
android/android/2.1_r2/android/widget/videoview.java