How to Choose the Optimal Data Rate for Live Streaming
When the only tool you have is a hammer, every problem looks like a nail. When you have objective video quality measurement tools, you can use them to check all the assumptions that you’ve made in the past. One of those assumptions is that copious outbound bandwidth is necessary to achieve high-quality live streaming. While this is true to some extent, as the tests I ran for this article demonstrate, you quickly reach a point of diminishing returns.
Object of the Exercise
The object of the exercise was to show how output quality is affected by the data rate of the incoming video in a cloud transcoding scenario. Specifically, services such as Ustream, YouTube Live, and Brightcove/Zencoder take a single input stream and transcode it in the cloud into an adaptive group of streams. While higher data rates generally produce better quality, outbound bandwidth is limited at many venues and can be expensive at an offsite conference or similar event. The question is this: How much difference will a typical viewer see between an affordable 3Mbps 720p stream and a possibly extravagant 6Mbps 720p stream?
As it turns out, not so much.
How I Tested
I tested with two clips. One is a talking head clip of me, the other a song from a recent concert appearance of the Loose Strings in Galax, Va. The Loose Strings clip is a single-camera concert shoot with lots of pans and zooms. It’s more challenging than a talking head clip, but not exactly a chase scene from Mission: Impossible when it comes to encoding complexity.
Both clips started life as 1080p AVCHD, which I copied to disk, imported into Premiere Pro, added timecode, and output as ProRes. I then played the clips out using a Blackmagic Design DeckLink HD Extreme 3D card via HDMI and captured with a Matrox Monarch HDX.
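If you want to replicate this prep step without Premiere Pro, ffmpeg can produce a similar ProRes mezzanine file with a starting timecode. The sketch below is a rough equivalent, not the workflow I used; the filenames and timecode value are hypothetical, and it assumes ffmpeg is installed.

```python
# Rough ffmpeg equivalent of the mezzanine step: transcode 1080p AVCHD
# to ProRes with a starting timecode. Filenames are hypothetical;
# assumes ffmpeg is installed and on the PATH.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "concert_1080p.mts",    # hypothetical AVCHD source
    "-c:v", "prores_ks",          # ffmpeg's ProRes encoder
    "-profile:v", "3",            # ProRes 422 HQ
    "-timecode", "00:00:00:00",   # starting timecode
    "-c:a", "pcm_s16le",          # uncompressed audio, typical for a mezzanine file
    "concert_prores.mov",
], check=True)
```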
For reasons I’ll explain in a moment, I first captured a 20Mbps 720p stream, then a 10Mbps stream, and then continued capturing at successively lower data rates down to 1Mbps. These captures served as the source test clips.
In a transcode scenario, you don’t care about the quality of the incoming clip; you care about the quality of the clip encoded from that source clip. To measure this, I created two presets in Sorenson Squeeze, one for 640x360 output at 1.2Mbps, the other for 1280x720 output at 2Mbps, both using x264 with the Fast Encoding preset that many live producers use during transcoding. Then I input all the source clips into Squeeze and encoded them with those presets.
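Squeeze configures all of this through its preset interface, but if you want to approximate the same transcodes with open-source tools, the sketch below shows a rough ffmpeg/x264 equivalent. It isn’t Squeeze’s exact recipe; the filenames are hypothetical, and I’m assuming Squeeze’s Fast Encoding preset maps loosely to x264’s fast preset.

```python
# Approximate the two transcoding presets with ffmpeg/x264. This is a
# sketch, not Squeeze's exact settings; filenames are hypothetical.
import subprocess

PRESETS = [
    ("360p", "scale=640:360", "1200k"),   # 640x360 at 1.2Mbps
    ("720p", "scale=1280:720", "2000k"),  # 1280x720 at 2Mbps
]

def transcode(source: str) -> None:
    for label, scale, bitrate in PRESETS:
        subprocess.run([
            "ffmpeg", "-i", source,
            "-vf", scale,
            "-c:v", "libx264",
            "-preset", "fast",    # loose analogue of Squeeze's Fast Encoding preset
            "-b:v", bitrate,
            f"{source.rsplit('.', 1)[0]}_{label}.mp4",
        ], check=True)

transcode("source_3mbps.mp4")   # hypothetical Monarch-captured source clip
```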
(To keep us all sane, I’ll call the original Monarch-encoded clips the “source” clips, all produced at 720p, and the 360p and 720p Squeeze-encoded clips the “transcoded” clips.)
Next, I measured the quality of the source and transcoded clips using the Moscow University Video Quality Measurement Tool (VQMT) and the SSIMWave Video Quality of Experience Monitor (SQM), which you can read more about in “How to Use Objective Quality Measurement Tools.”
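VQMT and SQM are commercial tools with their own interfaces, so I won’t reproduce their workflows here. If you want a free way to run the same kind of full-reference comparison, ffmpeg’s built-in ssim filter is a rough stand-in; it computes a different metric, so its scores won’t match VQM or SQM values. The filenames below are hypothetical.

```python
# Full-reference comparison with ffmpeg's ssim filter: a free, rough
# stand-in for commercial tools like VQMT or SQM, not a replica of them.
# Filenames are hypothetical; assumes ffmpeg is installed.
import subprocess

result = subprocess.run([
    "ffmpeg",
    "-i", "transcoded_720p.mp4",   # clip under test
    "-i", "reference_720p.mp4",    # reference clip
    "-lavfi", "[0:v][1:v]ssim",    # compare the two video streams frame by frame
    "-f", "null", "-",             # discard the output; stats print to stderr
], capture_output=True, text=True)

# ffmpeg prints a summary such as "SSIM Y:0.98 ... All:0.97 ..." to stderr.
for line in result.stderr.splitlines():
    if "SSIM" in line:
        print(line)
```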
These tests raised some interesting technical issues. By way of background, when testing the quality of encoded VOD clips, you start with the source clip, encode it, and compare the output with the source using the selected tool. Easy peasy.
In a live scenario, the source file isn’t as easy to identify because the signal has been changed slightly so many times. For example, during my capture workflow, the clip was processed by the Blackmagic card for transmission via HDMI, and then scaled by the Monarch during capture. At first, I tried comparing the Monarch-encoded source clips with a 720p clip scaled and output in Premiere Pro, but the scaling and other differences were so substantial that they essentially masked the true compression-related quality differences.
So rather than comparing the Monarch-produced source clips to the Premiere Pro output, I compared them to the aforementioned 20Mbps source clip. To analyze the transcoded clips produced in Squeeze, I encoded the 20Mbps clip using the same 360p and 720p presets, and compared all the other transcoded clips to those outputs.
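To make the pairings concrete, here’s how the comparisons line up in sketch form. The filenames are hypothetical, the intermediate data rates are placeholders (only the 20Mbps, 10Mbps, and 1Mbps points are pinned down above), and compare() stands in for whichever measurement tool you run.

```python
# Which clip gets measured against which reference, per the procedure
# above. Filenames are hypothetical, the intermediate data rates are
# placeholders, and compare() stands in for the measurement tool.
SOURCE_RATES_MBPS = [1, 2, 3, 4, 5, 6, 10]   # the 20Mbps capture is the reference

def compare(test_clip: str, reference: str) -> None:
    print(f"measure {test_clip} against {reference}")

# Source clips: each Monarch capture vs. the 20Mbps capture.
for rate in SOURCE_RATES_MBPS:
    compare(f"source_{rate}mbps.mp4", "source_20mbps.mp4")

# Transcoded clips: each Squeeze output vs. the same preset's
# transcode of the 20Mbps source.
for rate in SOURCE_RATES_MBPS:
    for preset in ("360p", "720p"):
        compare(f"transcoded_{rate}mbps_{preset}.mp4",
                f"transcoded_20mbps_{preset}.mp4")
```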
I can’t say for sure that this is the best approach—and I’m open to suggestions for future articles—but this is the procedure I used for this exercise. Now, let’s get to the results.
Results, Please
Let’s start with a brief description of each test and what it purports to measure. The Moscow University test measures the comparative quality of the clips, with lower scores being better. Scores are not tied to anticipated subjective ratings in any way.
The SSIMWave test assigns a numerical score to each clip, and those scores are designed to align with anticipated subjective evaluations. For example, clips that score in the 80–100 range should be rated as excellent quality by real-world viewers, while clips that score between 60 and 80 should be rated as good quality.
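If you’re scripting around SQM-style scores, the two bands above translate directly into code. This is just a sketch of the mapping described here; the label for scores below 60 is my own placeholder, not a published SSIMWave category.

```python
# Map a score to the subjective bands described above. Only the 80-100
# and 60-80 bands come from this section; scores below 60 get a
# placeholder label rather than an invented category.
def rating(score: float) -> str:
    if 80 <= score <= 100:
        return "excellent"
    if 60 <= score < 80:
        return "good"
    return "below the good band"

print(rating(87.5))   # -> excellent
print(rating(72.0))   # -> good
```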
With this as background, let’s dive into the results.
Talking Head
Chart 1 shows the talking head results as measured by VQMT. As a brief aside, the Moscow University VQMT offers a range of different analyses, including Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index (SSIM), and Video Quality Metric (VQM), the last of which is the measure I used in all my tests with this tool. With the VQM test, again, lower scores are better.
Chart 1. Talking head source and transcoded clips as measured by VQMT. Lower scores are better.
The blue line shows the scores for the source clips, which start high at 1Mbps and drop steadily as the data rate increases, with quality continuing to improve even at the higher data rates. The red line shows the VQM values for the 360p transcoded clips, while the yellow line shows the values for the 720p transcoded clips. As you can see in the chart, while the values continue to improve with the increased data rate of the source clip, you reach a point of diminishing returns at around 4Mbps.