Multiple Delivery Mechanisms for Streaming
To stream or not to stream? That, often, is the question.
What’s a content creator to do? Go with live streaming to attract the most timely audience, but pay extra to install T1 or fiber for infrequent live events? Capture content locally without a live stream and then rebroadcast it numerous times? Bundle all the content into a single .zip file and let users download and keep it on their local machines? Use progressive downloads and force viewers to wait an average of 30 seconds to begin viewing, in the hope that bandwidth will meet or exceed playback speed and spare them glitches in the content?
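To see why progressive download imposes that wait, a quick back-of-the-envelope calculation helps. The sketch below is purely illustrative (the function name and the 5-minute/330Kbps figures are made up for the example, not drawn from any particular deployment); it computes the minimum prebuffer time needed for glitch-free playback when the connection is slower than the encoded bitrate:

```python
def startup_delay(duration_s: float, bitrate_kbps: float, bandwidth_kbps: float) -> float:
    """Minimum prebuffer (seconds) for glitch-free progressive playback."""
    if bandwidth_kbps >= bitrate_kbps:
        return 0.0  # the download outruns the playhead; start right away
    # Total transfer time minus clip duration is the gap that must be
    # absorbed up front, before playback begins.
    transfer_s = duration_s * bitrate_kbps / bandwidth_kbps
    return transfer_s - duration_s

# A 5-minute clip encoded at 330Kbps over a 300Kbps effective connection:
print(startup_delay(300.0, 330.0, 300.0))  # 30.0 seconds before playback
```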
Today’s streaming media content delivery models often operate as if it were still 1999. Delivery is still typically best-effort: content is pushed at a constant data rate, with little accounting for extreme bandwidth fluctuations, least of all surplus bandwidth that could be put to use. At the same time, these streaming models require content producers to capture and deliver content at "best guess" bitrates that are often too high for the dial-up audience or too low for the broadband audience.
To make matters worse, these scenarios require the producer to perform additional steps for on-demand delivery, creating numerous versions of the same content, which in turn demands larger storage and content management systems.
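To put a number on that storage burden, here is a minimal sketch; the bitrate ladder below is hypothetical, but the arithmetic shows how quickly the versions add up:

```python
# Hypothetical bitrate ladder: one rendition per target audience.
renditions_kbps = {"dial-up": 37, "ISDN": 100, "DSL": 300, "broadband": 700}
duration_s = 60 * 60  # one hour of content

for name, kbps in renditions_kbps.items():
    size_mb = kbps * duration_s / 8 / 1024  # kilobits -> megabytes
    print(f"{name:>9}: {size_mb:6.1f} MB")

total_mb = sum(kbps * duration_s / 8 / 1024 for kbps in renditions_kbps.values())
print(f"Four renditions of one hour: {total_mb:.1f} MB to store and manage")
```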
Fortunately, several new technologies, combined with some almost-forgotten techniques and a few too-hastily abandoned ones, offer hope that emerging rich media delivery will meet users’ viewing expectations [see "Brave New Interfaces," p. 60]. Let’s look at a few of these technologies and how they might enhance future delivery models.
Peer-to-Peer
The peer-to-peer model has two major flaws that have limited its effectiveness for streaming media delivery. The first was legal: few streaming companies wanted to invest in a peer-to-peer model during the height of the RIAA/MPAA inquisition and mass lawsuits, for fear of being lumped in with a P2P crowd that bore the "illegal download" stigma. That issue was temporarily resolved by a 2004 appeals court ruling that absolved creators of peer-to-peer software of liability for their end users’ actions. The more recent Supreme Court ruling in the Grokster case only muddied the waters, holding that peer-to-peer networks can be held liable only if they actively promote their software’s copyright-infringing uses.
A greater obstacle than the legal issue, however, was the technical one. Peer-to-peer was not as efficient as its proponents claimed, at least in its pre-BitTorrent manifestations. Earlier peer-to-peer software was slower than a good content delivery network in side-by-side comparisons, primarily because a peer-to-peer download required an extensive search for other peers holding the same file. Even when such peers were found, the content had to be downloaded in linear fashion. BitTorrent solved that "whole file" problem by looking for peers holding any part of the file and aggregating those bits and pieces on the local machine, where they are stitched back together. The local machine also uploads pieces to other peers as soon as its own download begins, eventually becoming a "seeder" once it holds the complete file (previous peer-to-peer software did not require uploads in order to download content). Once a BitTorrent session gets going, speeds increase dramatically, filling as much bandwidth as required (or allowed). Kontiki’s Grid Delivery Technology solution is one successful deployment of a BitTorrent-like approach to content distribution.
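The piece-oriented idea is simple enough to sketch in a few lines of Python. This is an illustration of the concept only, not the real BitTorrent wire protocol; peer discovery, piece hashing, and choking are all omitted, and the names are invented for the example:

```python
import random

PIECES = 64  # the file is split into fixed-size pieces

def download(peers: dict[str, set[int]]) -> list[int]:
    """Assemble a file from whichever peers hold whichever pieces.

    peers maps a peer name to the set of piece indices it can serve.
    Pieces arrive out of order and are stitched together locally.
    """
    have: set[int] = set()
    arrival_order: list[int] = []
    while have != set(range(PIECES)):
        # Request any piece we lack from any peer that has it -- no single
        # peer needs the whole file, unlike a linear single-source download.
        wanted = [(piece, peer) for peer, held in peers.items()
                  for piece in held - have]
        if not wanted:
            raise RuntimeError("no peer holds the remaining pieces")
        piece, _peer = random.choice(wanted)
        have.add(piece)              # stitch the piece into the local copy...
        arrival_order.append(piece)  # ...where it can be uploaded to others at once
    return arrival_order

# Three peers, each holding an overlapping fragment of a 64-piece file:
swarm = {"A": set(range(0, 40)), "B": set(range(20, 64)), "C": set(range(50, 64))}
print(download(swarm)[:8])  # pieces arrive in no particular order
```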
BitTorrent goes only so far, however; live streaming has yet to be proven effective over peer-to-peer transmissions. Even for on-demand content, two things must happen for a peer-to-peer network to be used effectively. First, a sizable portion of the beginning of an on-demand stream should be available on as many peers as possible; ideally, that portion would constitute at least 30 seconds’ worth of content, enough to cover the lag while the client finds the optimum number of peers to accelerate the rest of the download. Second, and more important, this head-of-file content should be downloaded first, so that the local machine can begin playing the content before the entire file has been streamed or downloaded.
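A streaming-oriented peer-to-peer client might express both requirements as a simple piece-priority rule: fetch the head of the file strictly in order, then fall back to rarest-first for everything else. The sketch below assumes made-up numbers (two seconds of video per piece, a 30-second priority window) purely for illustration:

```python
PIECE_SECONDS = 2                   # hypothetical: ~2 seconds of video per piece
HEAD_PIECES = 30 // PIECE_SECONDS   # the first 30 seconds of content

def next_piece(have: set[int], availability: dict[int, int]) -> int:
    """Choose the next piece to request from the swarm.

    availability maps piece index -> number of peers currently offering it.
    The head of the file is fetched strictly in order so playback can begin
    before the transfer finishes; later pieces are fetched rarest-first to
    keep them well replicated across the swarm.
    """
    for piece in range(HEAD_PIECES):  # in-order priority window
        if piece not in have and availability.get(piece, 0) > 0:
            return piece
    candidates = [p for p, n in availability.items() if n > 0 and p not in have]
    if not candidates:
        raise RuntimeError("no peer offers any piece we still need")
    return min(candidates, key=lambda p: availability[p])  # rarest first
```

Falling back to rarest-first after the in-order window is the usual compromise: strict in-order fetching everywhere would starve the swarm of the scarce later pieces that keep everyone’s downloads fast.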