
Simultaneous Multiformat Encoding

Distributing web content to multiple browsers was once the primary content delivery challenge for streaming media producers. But the proliferation of set-top boxes and mobile devices has added even more complexity to the delivery of video.

Even for content owners who pay a content delivery network (CDN) to deliver to various platforms, understanding how best to encode content for these viewing platforms may be the difference between being a leader and being an also-ran in today’s competitive and increasingly fragmented media environment.

Traditional broadcast production and distribution platforms used a limited range of formats, such as MPEG-2. But today’s multiplatform world is usually not so consistent. Beyond the expanding breadth of viewing devices, the equal proliferation of encoding parameters—bitrates, codecs, and container formats—means that content owners could face significant duplication of effort to push content to every customer.

This article is meant as an introductory-level teaser of sorts. Even playback on a single distribution platform, such as web video on a personal computer, leads to an array of additional considerations and options. Also, in the interest of introducing some general concepts, we’ll overlook a few of the deeper details (for example, metadata handling) and leave those for a separate discussion.

As for terminology, for the purposes of this article "encoding" will refer to the creation of a live stream or file from a live or tape-based digital or analog video source, and "transcoding" will mean the conversion of file-based media from one container format or codec to another. Likewise, "deliverables" or "delivery" will refer to either live streams or encoded files for subsequent on-demand streaming or progressive download viewing on multiple device platforms. It’s worth noting that the breadth of compression and file formats used in file-based production, encoding, and distribution far exceeds that used for live real-time streaming delivery.

Convergence of Codecs and Containers
One bright spot in all of the complexity that exists around delivery to multiple platforms is the continued convergence of a few key video compression and container formats.

One example of this is H.264, a video compression standard also called AVC or MPEG-4 Part 10. While H.264 originated within the MPEG-4 effort as an intended successor to MPEG-2, H.264 video compression can be used in a variety of container formats, including QuickTime MOV, Adobe Flash F4V, and the same Transport Stream wrapper often used for MPEG-2 content.

A container format is often nothing more than a "wrapper" in which compressed video and corresponding encoded audio are delivered to a player application. Some player applications will not play particular container formats, even if the content is encoded with a proper combination of audio and video codecs, so simply changing the container format can sometimes allow previously encoded content to be repurposed for use in additional video players. In some cases, however, a player will support only a limited range of frame sizes, bitrates, or even GOP (group of pictures) structures within a compression format, requiring the content itself to be transcoded.
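
To make the distinction concrete, here is a minimal sketch of rewrapping versus transcoding, assuming the open source ffmpeg command-line tool is installed and invoked from Python; the file names and bitrate are hypothetical placeholders, not values from this article.

```python
import subprocess

def rewrap(src: str, dst: str) -> None:
    """Change only the container: copy the already-compressed audio and
    video streams into a new wrapper without re-encoding them."""
    subprocess.run(["ffmpeg", "-i", src, "-c", "copy", dst], check=True)

def transcode(src: str, dst: str, video_bitrate: str = "1200k") -> None:
    """Re-encode the content itself (new codec, bitrate, frame size, etc.),
    which is required when the target player rejects the original encode."""
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-c:v", "libx264", "-b:v", video_bitrate,  # new H.264 video encode
         "-c:a", "aac",                             # new AAC audio encode
         dst],
        check=True,
    )

# Hypothetical usage: the same H.264/AAC content rewrapped from QuickTime MOV
# into an MPEG transport stream, versus a full transcode for a pickier player.
# rewrap("master.mov", "master.ts")
# transcode("master.mov", "web_720p.mp4")
```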

Common technologies for powering web viewing experiences include Microsoft Silverlight, Adobe Flash, and Apple QuickTime. While each technology supports additional compression formats (for Silverlight, Windows Media and the SMPTE standard VC-1 based on it; for Flash, On2’s VP6 codec first introduced with Flash 8), all three web technologies support video encoded with H.264 compression—providing common ground with current mobile and IPTV platforms. For web-based and newer mobile phone-based deployments, Adobe Flash and Microsoft Silverlight also offer interactivity integrated into the video viewing experience.

For web and mobile viewing, the inclusion of adaptive bitrate playback in web-based video players is a welcome addition. The video source is encoded at multiple bitrates, and the players dynamically adapt to variations in consumer bandwidth to provide the optimum viewing bitrate for any given 2–10-second increment. These technologies include Microsoft IIS Smooth Streaming, Adobe Dynamic Streaming, and Apple’s HTTP Live Streaming for the iPhone.
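
As a rough illustration of multi-bitrate encoding for adaptive playback, the sketch below produces several renditions of one source as segmented HLS output (just one of the three approaches mentioned above), again assuming ffmpeg is available and driven from Python. The three-rung bitrate ladder and six-second segment length are illustrative assumptions, and the master playlist that ties the renditions together is omitted for brevity.

```python
import subprocess

# Hypothetical bitrate ladder: (label, frame height in pixels, video bitrate).
LADDER = [
    ("low", 360, "600k"),
    ("mid", 540, "1200k"),
    ("high", 720, "2500k"),
]

def encode_hls_renditions(src: str, segment_seconds: int = 6) -> None:
    """Encode one source at several bitrates so an adaptive player can
    switch renditions every few seconds as consumer bandwidth varies."""
    for label, height, bitrate in LADDER:
        subprocess.run(
            ["ffmpeg", "-i", src,
             "-vf", f"scale=-2:{height}",        # resize, preserving aspect ratio
             "-c:v", "libx264", "-b:v", bitrate,
             "-c:a", "aac",
             "-f", "hls",
             "-hls_time", str(segment_seconds),  # segment duration in seconds
             "-hls_playlist_type", "vod",
             f"{label}.m3u8"],
            check=True,
        )

# encode_hls_renditions("master.mov")
```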

Each of these targets has its own unique characteristics, and no one set of encoding specifications comprehensively and optimally serves them all. For any multiplatform distribution strategy, encoding to multiple formats, resolutions, and bitrates is an unavoidable requirement. Factor in nondistribution formats for acquisition, production, and archive, and the types of deliverables number in the dozens.
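
To give a sense of how quickly the deliverables multiply, a hypothetical deliverables matrix (the platforms, containers, and bitrates below are illustrative, not prescriptive) might be enumerated like this:

```python
# Hypothetical deliverables matrix: every (platform, bitrate) pair is one encoding job.
PLATFORMS = {
    "web_flash":       {"container": "f4v",  "codec": "h264",  "bitrates": ["500k", "1200k", "2500k"]},
    "web_silverlight": {"container": "ismv", "codec": "h264",  "bitrates": ["500k", "1200k", "2500k"]},
    "iphone":          {"container": "ts",   "codec": "h264",  "bitrates": ["200k", "600k", "1200k"]},
    "iptv":            {"container": "ts",   "codec": "mpeg2", "bitrates": ["8000k"]},
}

jobs = [
    (platform, spec["container"], spec["codec"], rate)
    for platform, spec in PLATFORMS.items()
    for rate in spec["bitrates"]
]
print(f"{len(jobs)} deliverables from a single source")  # 10 jobs from just four targets
```

Add acquisition, mezzanine, and archive formats, plus per-device frame sizes, and the count quickly climbs into the dozens.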
