
Video: Three Ways to Replace Flash for Low-Latency Live Streaming


Read the complete transcript of this clip:

Charlie Kraus: We’ve got to replace Flash, so here are three ways to do it. Two of these are actually in production today; people are doing this stuff live right now. The first is live streaming with a small chunk size. One clear way to get latency down: instead of 10-second chunks, reduce them to one-second chunks. If you do that, you're down to about three seconds of buffer before you send it out, and then you have a little bit of time to ingest it, encode it, et cetera.
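To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The three-segment player buffer and the ingest/encode/delivery overheads are illustrative assumptions, not figures from the talk.

```python
# Rough latency budget for chunked (HLS/DASH) live streaming.
# Assumes the player buffers ~3 segments before starting playback,
# plus fixed overheads for ingest, encode, and CDN delivery.

def latency_budget(segment_seconds, buffered_segments=3,
                   ingest_encode_seconds=1.5, cdn_delivery_seconds=1.0):
    player_buffer = segment_seconds * buffered_segments
    return player_buffer + ingest_encode_seconds + cdn_delivery_seconds

for seg in (10, 2, 1):
    print(f"{seg}-second segments -> ~{latency_budget(seg):.1f} s glass-to-glass")

# 10-second segments -> ~32.5 s
#  2-second segments -> ~8.5 s
#  1-second segments -> ~5.5 s  (in line with the 5-6 s seen in production)
```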

In production, we're seeing something like 5-6 seconds of actual, real-world latency out there. That's pretty good compared to 30 seconds or a minute and a half. And in this case we're doing the transcoding in the CDN, so the stream just comes in as RTMP and we transcode it.
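As a concrete picture of that first option, the sketch below pushes a contribution feed as RTMP to a CDN ingest point and leaves the transcoding and small-chunk packaging to the CDN. The ingest URL, stream key, and the use of ffmpeg driven from Python are assumptions for illustration, not the setup described in the talk.

```python
import subprocess

# Hypothetical CDN ingest point; the CDN transcodes and chunks on its side.
INGEST_URL = "rtmp://ingest.example-cdn.com/live/STREAM_KEY"

# Push the contribution feed as-is (no local transcode); the CDN handles
# the ABR ladder and the small-chunk HLS/DASH packaging.
subprocess.run([
    "ffmpeg",
    "-re", "-i", "source.mp4",   # stand-in for a live camera/encoder feed
    "-c", "copy",                # passthrough: transcoding happens in the CDN
    "-f", "flv",                 # RTMP carries FLV
    INGEST_URL,
], check=True)
```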

A lot of people like to do their own transcoding, and they're good at it. They like to do it; they like to have that control. All they want from the CDN is distribution, and they can do that too. In that case, you transcode it, do the chunking down to one second, and send it to us, and we just zip it across the CDN real quick. They just want to take advantage of the points of presence CDNs have globally for distribution. And then you're going to see about three seconds.

But ultimately we'd like to get it down to one second like Flash, and WebRTC will do that. One caveat on the small chunk size: it sort of depends on how good and clean your networks are, per the previous discussion. There are parts of the world where that ain't going to work. You can start at one second and see how it goes. The customers that are doing this have tried that first; if one second doesn't work, they try two. They try to find where the sweet spot is. In markets that have really good internet connectivity, like parts of APAC, most of North America, and Central Europe, it's working really well. I can tell you that in India you're going to be making that chunk size pretty big.

Everything just gets reduced by an order of magnitude. The chunk size goes down, so the start time is really quick. Now it's one second to go out, plus a little bit after that, and in production, as I mentioned, we're seeing about five or six seconds. Here RTMP comes in and HLS or DASH goes out; that's built into the CDN.

Or you can do it this way: do your own encoding, set that chunk size, and deliver the already-encoded DASH chunks to us, and we'll just zip them right through, accelerate them, and get them out. This is also in production, and we're seeing realistically about three seconds of latency.
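If you keep the encoding in-house, the chunk size is just a packager setting. Below is a minimal sketch, assuming ffmpeg driven from Python and a 30 fps source, that cuts one-second HLS segments (the same idea applies to DASH); the input URL, bitrates, keyframe cadence, and output paths are placeholders.

```python
import subprocess

# Encode locally and package one-second HLS segments; the CDN then only
# has to accelerate and distribute the already-chunked output.
subprocess.run([
    "ffmpeg",
    "-i", "rtmp://localhost/live/stream",   # your own contribution feed
    "-c:v", "libx264", "-b:v", "3000k",
    "-preset", "veryfast", "-tune", "zerolatency",
    "-g", "30", "-keyint_min", "30", "-sc_threshold", "0",  # keyframe every 1 s at 30 fps
    "-c:a", "aac", "-b:a", "128k",
    "-f", "hls",
    "-hls_time", "1",                        # the one-second chunk size
    "-hls_list_size", "6",
    "-hls_flags", "delete_segments",
    "out/live.m3u8",
], check=True)
```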

At the moment that's pretty good, until we get to the real thing. Now, is WebRTC going to be the thing? The jury's out. We're sort of jumping ahead; we're trying to get ahead of the game, and we figure the earlier you jump in and play around with stuff, the more you learn before everybody else gets into it. What we like about it is that almost all browsers support it. The only laggard is Windows, but that will happen in 2019; all the other browsers support it already today, so that's kind of nice.

WebRTC has actually been around a while. This is a Google creation and it was originally designed for peer-to-peer browser communications, so what's being used here is only a subset of its capabilities. We're not doing peer to peer. We're just doing one to many, so we're just doing unidirectional streaming.

But there's capability in the standard to do peer-to-peer, and there's also a built-in data channel, so you could do some out-of-band communications. That has a lot of play in the gambling industry. We're not trying to take advantage of those two things yet, but keep them in mind for the future.

What's different about it? It's not TCP, it's UDP. These are persistent connections set up in advance, and the way you deal with network problems is to jump to a lower-bitrate connection that's already been set up. Instead of having to retransmit and so forth, it just jumps connections. UDP is the only way this would work. This is not chunked streaming.
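The switching idea is simple enough to sketch. The snippet below is purely conceptual Python, not any real WebRTC API: it assumes a set of pre-established renditions and steps down to a lower bitrate when observed packet loss crosses a threshold, rather than retransmitting.

```python
# Conceptual sketch of "jump to a lower-bitrate connection" instead of
# retransmitting. Renditions are assumed to be already set up (as with
# pre-negotiated UDP streams); the numbers are illustrative only.

RENDITIONS_KBPS = [3000, 1500, 800, 400]   # pre-established, highest first
LOSS_THRESHOLD = 0.05                       # 5% packet loss triggers a switch

def next_rendition(current_index, observed_loss):
    """Return the rendition index to use for the next interval."""
    if observed_loss > LOSS_THRESHOLD and current_index < len(RENDITIONS_KBPS) - 1:
        return current_index + 1            # step down; no retransmission needed
    if observed_loss < LOSS_THRESHOLD / 2 and current_index > 0:
        return current_index - 1            # conditions improved; step back up
    return current_index

# Example: at 3000 kbps with 8% observed loss, drop to 1500 kbps.
print(RENDITIONS_KBPS[next_rendition(0, 0.08)])   # -> 1500
```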

How do you do the transition? Many CDNs have stopped supporting Flash; some have not. Several have not, and we have not. We continue to support it, and we have customers that continue to deliver it, especially in gambling; they're still using Flash. Their users are using plugins in their browsers to be able to do that; that's how they're supporting it. As for all these browsers that support WebRTC, most of them will still let Flash play, but you have to explicitly okay it. You click and then go; you have to explicitly opt in to do that.

In this transition, we'll output both Flash RTMP and HLS and DASH, and when 2018 comes and we're ready, WebRTC will just become another output. During the transition you have the option of doing all these formats, and as your users adopt the new ones, you can switch without changing your workflow. It's all there, very flexibly.
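One way to picture that multi-format transition: a single encode can feed RTMP and small-chunk HLS (and, later, other outputs) at the same time. The sketch below uses ffmpeg's tee muxer from Python; the destinations and encoder settings are placeholders, not the CDN's actual workflow.

```python
import subprocess

# One encode, two simultaneous outputs during the transition period:
# RTMP/FLV (for legacy Flash delivery) and one-second HLS segments.
outputs = "|".join([
    "[f=flv]rtmp://ingest.example-cdn.com/live/STREAM_KEY",
    "[f=hls:hls_time=1:hls_list_size=6]out/live.m3u8",
])

subprocess.run([
    "ffmpeg",
    "-i", "rtmp://localhost/live/stream",
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-g", "30", "-keyint_min", "30", "-sc_threshold", "0",
    "-c:a", "aac",
    "-map", "0:v", "-map", "0:a",   # tee requires explicit stream mapping
    "-f", "tee", outputs,
], check=True)
```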

Related Articles

Flash and the New Era of Media Streaming: The End or the Beginning?

Microsoft's announcement that it's removing Adobe Flash from Windows 10 is the final nail in the coffin, but the robust RTMP-based ecosystem that's grown around Flash is still thriving.

Flash's Last Hurrah

Most content publishers have already moved on from Flash, but those who need ultra-low latency have stuck with it. Now that Adobe is ending support for Flash, it's time to move to WebRTC, but it won't be easy.

Video: What Does Low Latency Really Mean?

RealEyes' David Hassoun discusses what low latency is and what it isn't, and sets reasonable expectations for the current content delivery climate.

Video: Why Scale and Distance Make It Harder to Deliver Low-Latency Streams

Wowza's James Jackson discusses the persisting challenges to delivering low-latency streams for large-scale events with far-flung live audiences and emerging strategies for meeting those challenges.

Video: How Important is Ingesting Clean Video to Reducing Latency?

NASA's Lee Erickson explains the value of ingesting a clean signal into an encoder for reducing latency downstream.

Video: Is WebRTC the Silver Bullet for Network Latency?

Streaming Video Alliance's Jason Thibeault and Limelight's Charley Thomas address the question of whether WebRTC provides a viable solution for network latency issues in this panel from Live Streaming Summit.

Video: How Much Latency Will Consumers Tolerate for NFL Live Streams?

Amazon's Keith Wymbs and Jim De Lorenzo discuss how they've met the challenges of improving latency and time-to-first-byte to serve the millions of viewers who are tuning in to Amazon's Thursday night NFL broadcasts in this keynote from Streaming Media West 2017.

Video: How to Reduce Latency for Mobile VR Streaming

Yahoo Director of Engineering Satender Saroha discusses latency issues particular to VR streaming to mobile and technical measures to address them.

Video: Best Practices and Key Considerations for Reducing Latency

Wowza Senior Product Manager Jamie Sherry discusses key latency considerations and ways to address them at every stage in the content creation and delivery workflow.

Video: Do You Really Need Low-Latency Streaming?

Wowza's Mike Talvensaari confronts the myth that low latency for large-scale streaming is always worth the expense, and discusses various applications and use cases along a continuum of latency requirements for effective delivery.