Content Delivery Summit: Google Talks YouTube Views, Caching
“Mobile makes up almost 40% of YouTube’s global watch time,” said Keith McCallion, Google’s technical program manager for peering and content delivery.
McCallion spoke on “Google’s Approach to Content Delivery” at the 2014 Content Delivery Summit, taking place today at the New York Hilton Midtown. The session focused on how Google, specifically for YouTube, works with network operators to deliver content and services efficiently and cost-effectively, with the aim of giving users high performance, high reliability, and low latency.
To understand the scale Google faces with YouTube, McCallion showed a number of statistics, such as the number of users (well above 1 billion) and the amount of video uploaded every minute (100 hours).
Delivering at this scale requires going beyond the typical solution, and McCallion emphasized Google’s commitment to increasing quality globally.
“We have a number of data centers around the world,” said McCallion. “We then have a large backbone, which we’ve spoken a bit more about in the last year, especially when it comes to SDN” or software-defined networking.
McCallion emphasized the fact that Google invests heavily in infrastructure and has increased its spend on infrastructure over the past few years.
He then showed the result of that investment, in the form of Google’s edge points of presence (PoPs) in North America, South America, Africa, Australia, Asia, and Europe.
“These PoPs connect via the Google backbone,” said McCallion, calling out work in Africa and India. “We’re willing to invest where others are not.”
That willingness to go further, investing time and resources to improve the quality of the YouTube viewing experience, was also a theme when it came to working with network providers. It includes the caching technology the company is only now discussing publicly, though it has been in beta for a number of years.
“We work closely with network operators around the world, while respecting open internet principles,” said McCallion, noting that the company balances delivery needs with prioritization concerns. Yet, he noted, open internet principles win out.
“Google doesn’t ask for the prioritization of its data,” said McCallion. “We also go beyond what most people do with serving from the edge. We now have Google Global Cache or GGC. This is one of the ways that a network provider can put the cache in the same location where they terminate viewer sessions.”
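Google hasn’t detailed GGC’s internals, but the general idea of an operator-hosted edge cache (serve popular videos from inside the ISP’s network, and fetch misses once over the backbone) can be sketched roughly as follows; the class, capacity, and fetch callback are illustrative assumptions, not Google’s design.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU edge cache: serve popular objects locally, fetch misses from origin."""

    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity                     # max number of objects held locally
        self.fetch_from_origin = fetch_from_origin   # callable: video_id -> bytes
        self.store = OrderedDict()                   # video_id -> cached bytes

    def get(self, video_id):
        if video_id in self.store:                   # cache hit: serve from inside the ISP's network
            self.store.move_to_end(video_id)
            return self.store[video_id]
        data = self.fetch_from_origin(video_id)      # cache miss: pull once over the backbone
        self.store[video_id] = data
        if len(self.store) > self.capacity:          # evict the least recently used object
            self.store.popitem(last=False)
        return data
```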
McCallion then went on to show examples of traffic data, as well as the YouTube Video Quality Report.
“We get billions of measurements per day, based on the number of viewers we have watching YouTube videos each day,” said McCallion. “We can break measurements down by a city level, and in a large city, we can even break it down within the city.”
This detailed performance data is available to internet service providers (ISPs), which, according to McCallion, can use it to help optimize delivery.
“We’ve taken that data we’re sharing with ISPs and put it into a report that’s easy for consumers to understand,” McCallion said, noting the Google Video Quality Report has launched in Canada.
“The idea is that this will upsell users to data rates that will sustain HD viewing,” said McCallion. “We launched this in Canada and expect to roll it out in other markets as well.”
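The report is built from real playback measurements, but a roll-up of that kind can be sketched simply: group per-session throughput samples by city and ISP, then compute the share that sustained an HD-capable rate. The threshold and field names below are assumptions for illustration, not Google’s actual methodology.

```python
from collections import defaultdict

# Minimum sustained throughput assumed here for HD playback (illustrative, not Google's threshold).
HD_KBPS = 2500

def hd_share_by_city(samples):
    """samples: iterable of (city, isp, measured_kbps) tuples from playback sessions.
    Returns {(city, isp): fraction of sessions that sustained an HD-capable rate}."""
    totals = defaultdict(int)
    hd_capable = defaultdict(int)
    for city, isp, kbps in samples:
        key = (city, isp)
        totals[key] += 1
        if kbps >= HD_KBPS:
            hd_capable[key] += 1
    return {key: hd_capable[key] / totals[key] for key in totals}

# Example roll-up over a handful of made-up sessions.
print(hd_share_by_city([
    ("Toronto", "ISP A", 4200),
    ("Toronto", "ISP A", 1800),
    ("Toronto", "ISP B", 3100),
]))
```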
McCallion closed by noting that Google is constantly trying to improve quality.
“The fact that YouTube viewing has gotten better over the last few years is due to the money we’ve spent on infrastructure,” he said.
One audience member asked about the cache hit ratio in a typical GGC.
“It varies dramatically,” said McCallion. “The active content directory is based on the market you are in. The range is anywhere from 60% to as high as 90%.”
He noted that the ratio also depends on the age of the GGC platform, as some markets have older caches that were installed when hard drive sizes were smaller.
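For context, the ratio in question is simply the share of requests a cache node answers locally rather than pulling over the backbone. A minimal calculation, with made-up request counts chosen to match the 60% to 90% range McCallion cited:

```python
def cache_hit_ratio(hits, misses):
    """Fraction of requests served from the local cache rather than over the backbone."""
    total = hits + misses
    return hits / total if total else 0.0

# Two illustrative nodes: a newer cache with larger drives and an older, smaller one.
print(f"newer node: {cache_hit_ratio(hits=900_000, misses=100_000):.0%}")  # 90%
print(f"older node: {cache_hit_ratio(hits=600_000, misses=400_000):.0%}")  # 60%
```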
“We are not moving to a hardware-agnostic solution, because we’re deeply integrated all the way down to the NIC,” said McCallion, noting that the hardware design is a Google custom design.