How to Implement HLS for Google TV

HTTP Live Streaming[1], aka HLS[2], is a standard for streaming multimedia content (audio and video) that is supported by Google TV.

There are many cool features that come with HLS. The main ones are:
1) Adaptive streaming - automatically adapts to congestion or bandwidth availability.
2) Resilience to transient network failures.
3) No special configuration for your server, routers, or firewalls - it's just HTTP 1.1.
4) Easily supported by Content Delivery Networks.
5) Live streaming is supported (more in a longer article).
6) HTML5 video tag support in Chrome for Google TV.
7) Optional AES encryption (more in a longer article).

On Google TV, HLS is a standard protocol: you just pass your URL to any of the media playback APIs, such as MediaPlayer or VideoView, and it just works.
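As a quick sketch of that, here is a minimal Android activity that plays an HLS stream with VideoView. The layout resource, view ID, and stream URL are hypothetical; this is illustrative, not a complete app:

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class PlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.player);                      // hypothetical layout

        VideoView video = (VideoView) findViewById(R.id.video); // hypothetical view ID
        video.setMediaController(new MediaController(this));

        // An HLS URL is handled like any other media URL on Google TV.
        video.setVideoURI(Uri.parse("http://example.com/stream/playlist.m3u8"));
        video.start();
    }
}
```

The same URL could equally be passed to MediaPlayer.setDataSource(); no HLS-specific code is needed.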

Components of an HLS stream
.m3u8 - Text-based manifest or playlist file (may be updated for live content). A variant playlist usually points to individual manifests that also end in .m3u8.
MIME type: application/vnd.apple.mpegURL or application/x-mpegURL

.ts - MPEG-2 Transport Stream segment - typically 5-10 seconds of video and audio data.
MIME type: video/MP2T
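To show how these pieces fit together, here is a minimal media playlist for a short video-on-demand clip (the segment filenames are hypothetical):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment00.ts
#EXTINF:10.0,
segment01.ts
#EXT-X-ENDLIST
```

Each #EXTINF line gives the duration of the .ts segment named on the following line; #EXT-X-ENDLIST marks the playlist as complete (it is absent for live content, where the playlist keeps being updated).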

Creating content for HLS
The easiest way to create HLS content is to use Apple's tools[3], the latest version of Sorenson Squeeze, Telestream's Episode, or one of the many cloud encoding providers. You start by encoding your content at many bit rates. On Google TV, the first entry in the .m3u8 file is the stream playback starts with; it's probably best to list the 1.2 Mbps stream first.
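A variant playlist ordered this way might look like the following sketch, with the 1.2 Mbps stream listed first so playback starts there (the sub-playlist paths are hypothetical; the bandwidth and resolution values follow the encoding tables later in this post):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1240000,RESOLUTION=640x360
mid_1200k/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=640000,RESOLUTION=640x360
low_600k/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2540000,RESOLUTION=1280x720
high_2500k/playlist.m3u8
```

After starting with the first entry, the player switches between entries based on the measured bandwidth, which is what makes the streaming adaptive.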

Google TV supports HLS protocol version 3 as of Google TV firmware version 3.2.

Your content URLs must contain the characters “.m3u8”. If the URL doesn’t end with “.m3u8”, the system will make at least 2 extra requests before playback, and the MIME type of the playlist must be one of “application/vnd.apple.mpegURL” or “application/x-mpegURL”.
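One way to sanity-check your URLs before handing them to the player is a simple substring test. This is a hypothetical helper, not a platform API:

```java
public class HlsUrlCheck {
    // Returns true if the URL contains ".m3u8", so Google TV can identify
    // the stream as HLS without extra requests before playback.
    static boolean isRecognizedHlsUrl(String url) {
        return url != null && url.toLowerCase().contains(".m3u8");
    }

    public static void main(String[] args) {
        System.out.println(isRecognizedHlsUrl("http://example.com/live/playlist.m3u8"));    // true
        System.out.println(isRecognizedHlsUrl("http://example.com/stream.m3u8?token=abc")); // true
        System.out.println(isRecognizedHlsUrl("http://example.com/stream?format=hls"));     // false
    }
}
```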

Note - Google TV does not currently support codec switching, so ad segments must use the same encoding as the main content. Of course, developers can pause the HLS playback, play some other content, then resume the HLS playback to get around this.
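The pause-and-resume workaround can be sketched as a small helper around android.media.MediaPlayer. The class name and the idea of a separate playAd() step are assumptions for illustration; only pause(), getCurrentPosition(), seekTo(), and start() are real MediaPlayer calls:

```java
import android.media.MediaPlayer;

public class AdBreakHelper {
    private int resumeMs;

    // Call before playing an ad that uses a different encoding:
    // remember where the HLS stream stopped, then pause it.
    public void beforeAd(MediaPlayer hlsPlayer) {
        resumeMs = hlsPlayer.getCurrentPosition();
        hlsPlayer.pause();
        // ...play the ad with a separate player or view...
    }

    // Call when the ad finishes: jump back and resume the HLS stream.
    public void afterAd(MediaPlayer hlsPlayer) {
        hlsPlayer.seekTo(resumeMs);
        hlsPlayer.start();
    }
}
```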


Encoding content is as much an art as it is a science. The best choices depend heavily on your content (for example, how fast objects move against the background) and on the devices you are targeting, among many other factors too numerous for a short post[6]. The settings below are optimized for Google TV; older devices[5] may require different or additional encodings. Be aware that certain types of encoding for commercial purposes may require a license and/or the payment of royalties.

Audio encoding should be consistent across all streams. HE-AAC v1, HE-AAC v2, and AAC-LC at up to 48 kHz stereo are all acceptable choices.

16:9 Aspect Ratio[7]
Dimensions   Total Bitrate (kbps)   Video Bitrate (kbps)   Encoding
640x360      640                    600                    HiP, 4.1
640x360      1240                   1200                   HiP, 4.1
960x540      1840                   1800                   HiP, 4.1
1280x720     2540                   2500                   HiP, 4.1
1280x720     4540                   4500                   HiP, 4.1
1920x1080    6040                   6000                   HiP, 4.1
1920x1080    8196                   8156                   HiP, 4.1

4:3 Aspect Ratio
Dimensions   Total Bitrate (kbps)   Video Bitrate (kbps)   Encoding
640x480      640                    600                    HiP, 4.1
640x480      1240                   1200                   HiP, 4.1
960x720      1840                   1800                   HiP, 4.1
1280x960     2540                   2500                   HiP, 4.1
1280x960     4540                   4500                   HiP, 4.1

The current Google TV implementation only uses the keyframe at the beginning of each segment (for a 10-second segment at 30 fps, that is one keyframe every 300 frames). Apple's suggestion is a keyframe every 90 frames (every 3 seconds at 30 fps). Note - framerate[8] is a complex subject.
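The arithmetic behind those two numbers is just segment length times frame rate; a trivial helper (hypothetical, for illustration only) makes it explicit:

```java
public class KeyframeInterval {
    // Frames between keyframes when only the keyframe at the start of
    // each segment is used: segment duration (seconds) x frame rate (fps).
    static int framesPerKeyframe(int segmentSeconds, int fps) {
        return segmentSeconds * fps;
    }

    public static void main(String[] args) {
        // Google TV: one keyframe per 10-second segment at 30 fps.
        System.out.println(framesPerKeyframe(10, 30)); // 300
        // Apple's suggestion: a keyframe every 3 seconds at 30 fps.
        System.out.println(framesPerKeyframe(3, 30));  // 90
    }
}
```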

[7] Adapted from [5]
Is it possible to use an audio codec other than AAC, e.g. AC3, inside an HLS stream?
Great post. This is the most detail I've been able to find regarding HLS on Android yet.

Developers should note that Apple's segmenter won't necessarily cut on an even GOP boundary. You'll have to be careful with your settings to ensure the segments start with a keyframe.

I'm curious to know if non-TV Android (Honeycomb and ICS) have similar restrictions.
It's true and I have to acknowledge much being done well -but- why not go all the way and just stop being @ssholes that only support Java and the Android SDK? Why not C# and Objective-C? Why not an emulator for Windows and Mac developers?
"Your content URL’s must have the characters “.m3u8” within the URL" - Is there any reason why ".m3u" files [or content-type "audio/mpegurl"] wouldn't work, as per the standard?