What Is Adaptive Streaming?
[This article is part of Streaming Media's "What Is" series, providing a high-level overview and definitions of key concepts in online video.]
Executive Summary
Adaptive streaming technologies deliver the optimum streaming video viewing experience to a diverse range of devices over a broad range of connection speeds. If streaming video is mission critical to your enterprise and you're not using adaptive streaming today, or planning to implement it soon, you're already behind the curve. This article describes what adaptive streaming is, identifies the primary technology contenders, and discusses the factors you should consider when choosing a technology.
Adaptive Streaming Defined
Adaptive streaming technologies share several critical aspects. First, they produce multiple files from the same source file to distribute to viewers watching on devices with different capabilities over different connection speeds. Second, they distribute the files adaptively, changing the stream that's delivered to adapt to changes in effective throughput and available CPU cycles on the playback station.
Third, they all operate transparently to the user: the viewer clicks one button (rather than choosing among multiple bitrate and quality options beforehand, as with the classic movie trailer experience) and all stream switching occurs behind the scenes. The viewer may notice a slight change in quality as the streams switch, but no action is required on the viewer's part.
All technologies share similar operating characteristics as well, though there are some key differences. For example, all adaptive streaming technologies monitor factors like video buffer status to assess effective throughput, and CPU utilization and dropped frames to assess the available computing power on the playback station. This information is used to determine when to switch streams.
For example, if the video buffer is full and CPU utilization low, the adaptive streaming technology may switch to a higher quality stream to enhance the viewing experience. If the buffer drops below certain levels, or CPU utilization spikes above certain thresholds, the technology may switch to a lower quality stream.
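To make that logic concrete, here is a minimal sketch of such a heuristic in Python. The bitrate ladder, threshold values, and function names are hypothetical illustrations, not any particular vendor's algorithm.

```python
# A hypothetical bitrate ladder: multiple renditions encoded from the same source.
RENDITIONS_KBPS = [400, 800, 1500, 3000]  # lowest to highest quality

def choose_rendition(current_kbps, buffer_seconds, cpu_utilization, dropped_frame_rate):
    """Return the bitrate (in kbps) the player should request next.

    Thresholds below are illustrative assumptions, not real player defaults.
    """
    current_index = RENDITIONS_KBPS.index(current_kbps)

    # Buffer nearly full and CPU comfortable: try the next-higher-quality stream.
    if buffer_seconds > 20 and cpu_utilization < 0.6 and dropped_frame_rate < 0.01:
        return RENDITIONS_KBPS[min(current_index + 1, len(RENDITIONS_KBPS) - 1)]

    # Buffer draining, or CPU/dropped frames spiking: fall back to a lower-quality stream.
    if buffer_seconds < 5 or cpu_utilization > 0.85 or dropped_frame_rate > 0.05:
        return RENDITIONS_KBPS[max(current_index - 1, 0)]

    # Otherwise, stay on the current stream.
    return current_kbps

# Example: a player on the 1500Kbps stream with a healthy buffer steps up to 3000Kbps.
print(choose_rendition(1500, buffer_seconds=25, cpu_utilization=0.4, dropped_frame_rate=0.0))
```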
The key implementation difference between the technologies is the involvement of a streaming server. Specifically, some technologies require a streaming server, and constant communication between the server and player. If a stream switch is required, the server implements it by sending a different stream to the viewer.
Other technologies operate without a streaming server. The different quality streams are posted to different addresses on a web server or multiple web servers. The player monitors operating heuristics like CPU utilization and buffer status, decides when a stream switch is necessary and starts retrieving data from a different stream when appropriate.
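The sketch below illustrates that client-driven approach, assuming a hypothetical set of rendition URLs and segment naming scheme; real players (and real segment formats such as MPEG-2 transport stream chunks) are considerably more sophisticated.

```python
# Each rendition is just a series of short segments on a web server; the player
# picks which rendition's next segment to download based on measured throughput.
# URLs, naming scheme, and the 1.5x safety margin are illustrative assumptions.
import time
import urllib.request

RENDITION_URLS = {
    400:  "http://example.com/video/400kbps/segment_{:05d}.ts",
    1500: "http://example.com/video/1500kbps/segment_{:05d}.ts",
    3000: "http://example.com/video/3000kbps/segment_{:05d}.ts",
}

def download_segment(bitrate_kbps, segment_number):
    """Fetch one segment over plain HTTP; return (data, measured throughput in kbps)."""
    url = RENDITION_URLS[bitrate_kbps].format(segment_number)
    start = time.time()
    data = urllib.request.urlopen(url).read()
    elapsed = max(time.time() - start, 0.001)
    return data, (len(data) * 8 / 1000) / elapsed

def next_bitrate(measured_kbps):
    """Pick the highest rendition that fits comfortably under measured throughput."""
    candidates = [b for b in RENDITION_URLS if b * 1.5 <= measured_kbps]
    return max(candidates) if candidates else min(RENDITION_URLS)

# Illustrative playback loop: start conservatively, then adapt segment by segment.
bitrate = min(RENDITION_URLS)
for segment in range(1, 4):
    data, throughput = download_segment(bitrate, segment)
    bitrate = next_bitrate(throughput)  # switching streams is simply a change of URL
```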
Either way, adaptive streaming technologies enable producers to deliver outstanding quality streams at the high end of the bandwidth/power spectrum because they also serve the low end. Without adaptive streaming, most producers would either distribute a single mid-quality file that looks below average in the optimum viewing configuration, or create multiple files and force the viewer to select the desired configuration.
Adaptive Streaming Vendors and Service Providers
The players fall into three general categories: technology developers, service providers, and standards-based technologies. Prominent technology providers include Adobe with Flash-based Dynamic Streaming, Apple with HTTP Live Streaming (HLS), and Microsoft with Smooth Streaming for Silverlight. Move Networks, which pioneered this market, has largely dropped out as a general-purpose technology provider. Several WebM-based HTML5 options are also available or under development, including technologies from Anevia and Quavlive.
Service providers include primarily Akamai, with its Akamai HD Network, which is a platform that can deliver to iOS devices, Flash and Silverlight clients. Several companies, most notably Netflix, have developed their own adaptive streaming technologies for internal use.
Standards-based technologies include Scalable Video Coding (SVC), which is an extension of the H.264 specification. In addition, Apple has submitted the HTTP Live Streaming protocol to the Internet Engineering Task Force (IETF), where it is working its way through the standardization process.
Choosing an Adaptive Streaming Technology
Here are the primary factors that you should consider when choosing an adaptive streaming technology.
Supported Playback Platforms
Intuitively, the most important consideration is whether a technology can reach your target viewers. Starting with computer-based playback, with HTML5-compatible browser penetration still well below 60% (according to netmarketshare.com), this means either a Flash- or Silverlight-based solution. While Silverlight penetration has been growing over the last few years, at about 71% worldwide (according to www.riastats.com) it still trails the 97% Flash penetration reported by the same site.
If you’re distributing premium content, like the Olympics or Sunday Night Football, you may be able to assume that viewers will download and install the Silverlight client to watch your video. For more prosaic content, this assumption may be harder to make.
If mobile is an important target market, you’ll have to implement at least two adaptive streaming technologies, since neither Flash nor Silverlight can play on iOS devices or most other mobile platforms. Android developer Google has helped things along by implementing HLS on the Android 3.0 platform, and hopefully other mobile platforms will fall into line.
However, HTTP Live Streaming is a non-starter on traditional computers because it requires the QuickTime X player, which is available only on Mac OS X Snow Leopard and not at all on Windows. Though some third-party solutions that enable HLS playback on Windows are in the works, the penetration of these players would have to become very significant before most streaming producers would consider abandoning Flash or Silverlight for HLS on general-purpose computers.
Fortunately, supporting multiple adaptive streaming technologies is not as formidable a task as it once was, because multi-platform delivery is becoming more mainstream, whether through a service provider like Akamai or via third-party streaming servers. For example, Wowza Media Server 2 can ingest a single stream of H.264-encoded video in any of several formats and dynamically transmux the container format and protocol to deliver to Flash, Silverlight, iOS devices, and other platforms.
You can buy Wowza Media Server yourself, or choose an online video platform (OVP) or content delivery network (CDN) that uses the Wowza technology, or similar products, to offer these capabilities to third parties. In addition, at NAB 2011, Adobe previewed the ability of the Flash Media Server to deliver to iOS devices, while Microsoft's IIS Media Services can transmux an incoming Smooth Streaming stream for iOS delivery.
Which Protocol Does it Use?
When originally released, Adobe's Dynamic Streaming used the Real Time Messaging Protocol (RTMP) exclusively to distribute video to viewers. RTMP has several disadvantages compared to the more general Hypertext Transfer Protocol (HTTP) used by technologies such as HLS and Smooth Streaming. First, as a server-based technology, the initial implementation of Dynamic Streaming required a persistent connection between server and player, which can increase implementation cost and limit deployment scalability.
Second, RTMP packets may have difficulty getting through certain firewalls, though the Flash Media Server offers workarounds when these problems occur. Third, video delivered via HTTP can more easily leverage the standard HTTP caching available within the networks of ISPs, corporations, and other organizations, which can improve distribution efficiency and quality of service. Finally, there's a general perception among the technical cognoscenti that stream switching works more effectively with the chunk-based delivery used by HTTP-based adaptive streaming technologies.
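To illustrate the caching point, each HTTP-delivered segment is an ordinary web resource whose cache behavior you can inspect like any other web object. The segment URL below is hypothetical; in practice you would check a real segment on your own origin or CDN.

```python
# Inspect the standard caching headers returned with a (hypothetical) video segment.
import urllib.request

request = urllib.request.Request("http://example.com/video/1500kbps/segment_00001.ts")
with urllib.request.urlopen(request) as response:
    # The same caching machinery that serves web pages applies to video segments.
    print("Cache-Control:", response.headers.get("Cache-Control"))
    print("Age:", response.headers.get("Age"))          # set by intermediate caches
    print("X-Cache:", response.headers.get("X-Cache"))  # common CDN hit/miss header
```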
Taken together, these arguments make RTMP sound like a tragically deficient protocol for delivering adaptive streaming, but to a degree that's akin to arguing that air is bad for mammals or water bad for fish and crustaceans. That is, if you take large-scale events out of the picture, the overwhelming bulk of actual streaming video (as opposed to progressive download) is delivered via RTMP.
For example, the Wall Street Journal, Bloomberg, and the Financial Times all distribute Flash video via RTMP, which presumably they wouldn't use if the videos were rejected en masse by the firewalls protecting many of their tony viewers. MTV and CBS also stream via RTMP, which should put to rest concerns about scalability and cost, along with the supposedly decisive benefits of HTTP caching.
Overall, however, what finally put the RTMP vs. HTTP debate to bed was Adobe announcing and shipping an HTTP version of Dynamic Streaming. So now if you want to deliver to the Flash Player via HTTP, you have an Adobe option.
DRM
The final consideration, at least for some producers, is the availability of digital rights management (DRM) features to protect their content. In this regard, Adobe offers Flash Access and other technologies, and Microsoft offers PlayReady content protection. While HLS doesn't support a full DRM system, the specification does enable encryption, and other HTTP technologies, such as HTTPS authentication, can be used to limit access to content.
Those are the high-level concepts; the devil is in the details when it comes to deploying DRM. Just be sure to check the availability of DRM early in the process if it's critical to your monetization strategy.
Conclusion
Adaptive streaming may be the single most important capability for delivering the optimum quality of service to a diverse range of viewers. As usual, however, not all technologies are created equal and there is no one-size-fits-all solution. If streaming is mission critical to your enterprise, you need to be considering a solution or combination of solutions that best meets your needs.