
Collaboration Is Critical to Reduce Latency in Online Video


When we think about what’s happening in the video space—the gradual transition from broadcast to online distribution—one comparison between TV and online video always comes to mind: “broadcast quality.” I know I write and talk about this comparison until I’m blue in the face, but here it is again: Online video needs to meet the expectations people have developed for the video experience by watching broadcast television over the past 30 years. But we most often equate broadcast quality with the end-user experience. So the focus of content owners, distributors, network operators, and over-the-top (OTT) providers has been on improving buffer rates, reducing in-stream failures, eliminating play failures, and improving engagement metrics (such as how long a user watches a video). Technology providers such as Conviva, IneoQuest, and Nice People at Work all make their living by focusing on this aspect of the problem.

But that focus has left a host of inefficiencies and latency in other parts of the value chain—encoding, packaging, protection, delivery, and more. Throughout the workflow, things are slower than they probably need to be as content moves from acquisition to playback. Some of that slowness is a consequence of the physical hardware (routers, switches) and the public internet, while other degradation is a result of poorly cobbled-together solutions.

Think about it. The television industry has had decades to mature, to build workflows that work seamlessly and employ technology based on ratified standards mandated by governmental agencies. Online video, by contrast, is only a fraction of that age and has no such guidance. Companies will solve technical streaming problems with duct tape and bubble gum if necessary, focusing more on just getting the workflow (with all its disparate components) to function together in some fashion than on whether it’s the best way to accomplish the end task.

We can call this ugliness the “back end” of broadcast quality. Failing to optimize it means, ultimately, failing to provide consumers with the kind of experience they expect, because the back end impacts the front end. Whether a user has to wait a few extra seconds for just-in-time packaging or an extra hour for content to become available because of workflow inefficiencies, latency in the workflow and value chain has adverse effects on the entire experience. People don’t have to wait for TV.
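One way to make that back-end latency visible is to measure it stage by stage rather than only at the player. The sketch below is a minimal, hypothetical illustration—the stage functions, file names, and timings are stand-ins, not any vendor’s real tools—of timing each step of a simplified publishing workflow so its contribution to the overall delay can be seen.

```python
# Hypothetical sketch: time each back-end stage of a streaming workflow
# so its share of total latency is visible, instead of measuring only
# playback-side metrics. Stage functions are stubs, not real vendor tools.
import time

def timed(stage_name, fn, *args):
    """Run one workflow stage and report how long it took."""
    start = time.monotonic()
    result = fn(*args)
    elapsed = time.monotonic() - start
    print(f"{stage_name}: {elapsed:.2f}s")
    return result

# Stub stages standing in for real encoder, packager, and CDN integrations.
def encode(asset):
    time.sleep(0.2)  # pretend transcoding work
    return f"{asset}.renditions"

def package(renditions):
    time.sleep(0.1)  # pretend manifest/segment packaging
    return f"{renditions}.hls"

def publish(packaged):
    time.sleep(0.1)  # pretend origin upload / CDN propagation
    return f"https://cdn.example.com/{packaged}"

if __name__ == "__main__":
    renditions = timed("encoding", encode, "mezzanine.mov")
    packaged = timed("packaging", package, renditions)
    url = timed("delivery", publish, packaged)
    print("published:", url)
```

With per-stage numbers like these in hand, it becomes much easier to see which handoff in the chain is actually responsible for the delay a viewer experiences.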

To squeeze latency out of the workflow, online video technology companies need to rally around collaboration. Failing to expose APIs or to enable third-party companies to work with platforms and technologies only promises to slow down the video value chain even more. Instead of focusing on how to optimize front-end key performance indicators, perhaps OTT providers should look at how to optimize the interoperability between the different vendor technologies that make up their publishing and delivery processes. By turning attention away from the front end and toward the back end, content distributors can probably squeeze a considerable amount of wait time out of the system while also improving the overall integrity, resiliency, and operation of their platforms.
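As a concrete illustration of what “exposing APIs” can mean in practice, the sketch below shows the kind of small, documented status endpoint a workflow vendor might offer so partner systems can query where a job sits in the pipeline. The endpoint path, fields, and job data are hypothetical, not any vendor’s actual interface.

```python
# Hypothetical sketch of a minimal status API a workflow vendor could expose
# so downstream partners can integrate without guesswork. Paths and fields
# are illustrative only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in job store; a real system would query its own pipeline state.
JOB_STATUS = {"job-123": {"stage": "packaging", "queued_seconds": 42}}

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Expected shape: GET /jobs/<job_id>/status
        parts = self.path.strip("/").split("/")
        if len(parts) == 3 and parts[0] == "jobs" and parts[2] == "status":
            job = JOB_STATUS.get(parts[1])
            if job is not None:
                body = json.dumps(job).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
                return
        self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), StatusHandler).serve_forever()
```

Even an interface this small gives partners something stable to build against, rather than scraping logs or guessing at file-drop conventions—exactly the kind of duct-tape integration that adds latency to the back end.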

It’s not hard to make the argument that we need standards. The online video industry needs guidance, clarity, and best practices. But collaboration is just as important as documentation for building high-quality, consistent streaming services. At the Streaming Video Alliance, for example, competitors come together in an effort to optimize the value chain, balancing their own needs against the much broader needs of the industry as a whole. This kind of collaboration is critical to reducing latency, because without technology companies’ willingness to work together, content distributors will be left with Frankenstein systems woefully in need of optimization. And that will mean more latency in the back end of broadcast quality.

For the good of the industry, vendors must collaborate better. And when that happens, there’s a far better chance that online video will provide the broadcast-quality experience that consumers expect.

[This article appears in the October 2016 issue of Streaming Media magazine as "Collaboration Is Critical."]
