
Two Quick-Fix Solutions That Became Long-Term Problems


Our industry is prone to taking a quick-fix approach to solving short-term issues, inadvertently (or brilliantly, depending on whom you talk to) locking the technology into a “just good enough” solution for years or even decades. Oftentimes, the solution is a de facto one, meaning that one company or a small handful of companies owns a key patent (or group of patents) on a particular technology that is neither an international standard nor backward-compatible with international standards. This approach hobbles the industry’s long-term innovation.

Over the past 2 years, I’ve had a chance to work with some very innovative compression technologies, which appear to be hampered by the industry’s reliance on two long-in-the-tooth de facto standards: HTTP segment streaming and buffer-based playback.

We all know that the basic premise of live streaming via HTTP is, at best, flawed. However, the idea gained traction when two key trends coincided. First, corporate customers complained that their IT departments were blocking streaming ports on routers and data switches, eliminating the ability to watch external live-streamed content inside the corporate firewall. Second, premium content providers were having difficulty scaling live-stream delivery, since most real-time media protocols required expensive servers that each typically served fewer than 100 simultaneous viewers.

Streaming via HTTP solved those problems. The devil in the details, though, is the nagging permanence of longer-than-broadcast latency. Even as the average internet user’s data rates have increased, often to the point at which players rarely need to switch between the renditions listed in a segmented HTTP manifest, we’re still holding on to the HTTP-segment crutch as we limp toward the finish line of global live-stream delivery.
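To see why that latency never quite goes away, consider a rough back-of-the-envelope sketch (my own, in Python, with purely illustrative numbers): a segment can’t be advertised in a manifest until it has been fully encoded, and most players hold several segments in reserve before they start playing, so the segment duration multiplies directly into the delay behind the live edge.

```python
# Back-of-the-envelope sketch of segment-based live latency.
# All numbers are illustrative assumptions, not measurements.

def estimated_live_latency(segment_duration_s, segments_buffered, encode_and_cdn_s):
    """Approximate glass-to-glass delay for segmented HTTP streaming.

    A segment can only be listed in the manifest once it is fully
    encoded, and players typically hold several segments in the
    buffer before (and while) playing.
    """
    return encode_and_cdn_s + segment_duration_s * segments_buffered

# Classic HLS-style defaults: 6-second segments, ~3 segments buffered.
print(estimated_live_latency(6, 3, 4))   # ~22 seconds behind live
# Shorter segments help, but the structural delay never disappears.
print(estimated_live_latency(2, 3, 4))   # ~10 seconds behind live
```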

The second workaround, the buffer, is an even bigger impediment to streaming innovation and global synchronous delivery.

Unless you’ve been around the industry for the full 21-plus years of innovative chaos, you’ve probably never heard of Burst.com and the David versus Goliath fight it waged successfully against Microsoft around the patented concept of buffer-based playback. The short version is that Burst.com created a solution for real-time video delivery, attracting the interest of pundits like Robert X. Cringely and investors like the band U2, as early as 1994. After U2 used the technology to stream a concert in 1997, it became apparent that the data rate needed for acceptable visual quality with then-state-of-the-art codecs far exceeded the bandwidth the average consumer had access to.

Burst.com created Burstware, which monitored available bandwidth and tracked how frequently a consumer’s video stream stalled. If playback stalled too often, Burstware would pause playback and inform the consumer that it was going to buffer enough of the stream to allow him or her to watch the remainder of the video uninterrupted, if only the consumer would be patient for several dozen seconds.

That solved the short-term issue, but consumers ultimately used it as a way to watch streams encoded at data rates far above their available bandwidth, typically double the data rate of their dial-up modem or early DSL or T-1 internet connection.
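To put rough numbers on that workaround (my own arithmetic sketch in Python, not Burstware’s actual algorithm): if the stream is encoded at a higher data rate than the connection can sustain, the player has to buffer far enough ahead that the download finishes no later than playback does.

```python
# Sketch of the buffer-ahead arithmetic; not Burstware's actual algorithm.

def required_prebuffer_s(video_duration_s, encoded_kbps, bandwidth_kbps):
    """Seconds of pre-buffering needed so that playback, once started,
    reaches the end of the video without stalling again."""
    if bandwidth_kbps >= encoded_kbps:
        return 0.0  # the connection keeps up; no pre-buffer needed
    download_time_s = video_duration_s * encoded_kbps / bandwidth_kbps
    return download_time_s - video_duration_s

# A 1-minute clip encoded at double the throughput of a 56 kbps modem:
# the viewer waits roughly a minute up front, then watches uninterrupted.
print(required_prebuffer_s(60, 112, 56))   # 60.0
```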

Microsoft saw a demonstration of the technology and added it to Windows Media Player 9. Burst then successfully sued for patent infringement. The idea of the streaming buffer had firmly caught hold at that point, leading Microsoft to buy a non-exclusive license for Burstware, thereby guaranteeing we’d be stuck with the concept of buffering for at least the next 2 decades of streaming.

In conclusion, I want to challenge the industry to strongly consider jettisoning both of these workaround technology solutions. Without a bold move to eliminate HTTP-segment-based streaming and buffer-based playback, there’s little incentive for the industry to innovate with 21st-century technologies. Rather than relying on de facto standards from the 20th century, it’s time to develop innovative solutions that solve the joint issues of zero-frame latency and global scalability.

[This article appears in the July/August 2019 issue of Streaming Media Magazine as "Too Slow."]
