
Akamai Offers Best Practices for Troubleshooting Live Streams


While consumers and service providers have grown accustomed to flawless quality for live programming through traditional TV services, live event streaming is still making strides toward reaching the same level of consistency.

Live streaming is an extremely complex animal. For one, the lead time to prepare content is much shorter than it is for on-demand delivery. Additionally, factors like time of day, audience size, user device, user geolocation, workflow components (such as CDNs, encoders, and analytics), and last-mile bandwidth can all affect the quality of a live stream as perceived by the end user.

With so many variables and obstacles, service providers are likely to experience some minor hiccups at points along their live OTT workflows. But in today's consumer-centric world, even a slight issue can tarnish a brand and adversely impact subscriber and revenue numbers. For example, Akamai data shows that just one instance of buffering could cost service providers more than $85,000 in lost revenue.

Simply put, service providers need to be able to troubleshoot their live OTT video—and quickly. By putting an emphasis on continuous workflow performance monitoring, prioritizing communication with major stakeholders, and leveraging new advancements in machine learning, the industry can best prepare itself to mitigate any speed bumps along the way.

Continuous Workflow Performance Monitoring

Growth in the scale, complexity, and versatility of the streaming media landscape has made continuous workflow monitoring critical. A service provider's goal is to deliver a consistent, high-quality experience, particularly as audiences grow. Realizing that goal requires real-time data that provides detailed insight into the viewer's experience, shows how performance compares against baseline numbers, and surfaces key metrics across the workflow.

Whether it's measuring audience sizes, the viewer's path through your services, overall stream quality metrics, or identifying what content is (or isn't) moving the needle, data can be a service provider's best friend. And when a problem arises, such as a stream buffering or going down completely, visibility into real-time performance data can help teams quickly identify and mitigate the issue before it degrades the experience for a wide audience.
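To make this kind of real-time monitoring concrete, here is a minimal sketch (in Python, not an Akamai tool) of rolling hypothetical player beacons up into interval-level quality metrics, such as rebuffering ratio and startup time, and flagging anything that drifts well past a baseline. The beacon fields, baseline values, and alert threshold are all assumptions for illustration.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical player beacon: one record per playback session, per interval.
@dataclass
class Beacon:
    session_id: str
    startup_time_s: float   # time from play click to first frame
    buffering_s: float      # seconds spent rebuffering this interval
    watched_s: float        # seconds of content watched this interval

# Illustrative baselines; real values would come from prior events.
BASELINE = {"rebuffer_ratio": 0.005, "startup_time_s": 2.0}
ALERT_MULTIPLIER = 2.0  # flag metrics that exceed 2x baseline

def summarize(beacons: list[Beacon]) -> dict:
    """Roll per-session beacons up into interval-level quality metrics."""
    total_buffer = sum(b.buffering_s for b in beacons)
    total_watch = sum(b.watched_s for b in beacons)
    return {
        "concurrent_viewers": len({b.session_id for b in beacons}),
        "rebuffer_ratio": total_buffer / max(total_watch, 1e-9),
        "startup_time_s": mean(b.startup_time_s for b in beacons),
    }

def alerts(metrics: dict) -> list[str]:
    """Compare live metrics with baselines and emit human-readable alerts."""
    out = []
    for name, baseline in BASELINE.items():
        if metrics[name] > baseline * ALERT_MULTIPLIER:
            out.append(f"{name} = {metrics[name]:.3f} exceeds "
                       f"{ALERT_MULTIPLIER}x baseline ({baseline})")
    return out

if __name__ == "__main__":
    sample = [
        Beacon("a", 1.8, 0.0, 60), Beacon("b", 2.1, 1.5, 58), Beacon("c", 6.0, 4.0, 45),
    ]
    interval_metrics = summarize(sample)
    print(interval_metrics)
    for alert in alerts(interval_metrics):
        print("ALERT:", alert)
```

In practice, rollups like these would run continuously against streaming telemetry and feed dashboards and alerting rather than a one-off script.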

Looking further ahead, this backend performance data can also help ensure similar issues don't arise again. By analyzing data from previous major live streaming events, service providers can build a clearer picture of what to expect in similar situations moving forward.

Proactive, Real-Time Continuous Communication

The complexities of streaming live events mean that no single solution or platform can provide full visibility into every step of the workflow or help mitigate every potential challenge. This greatly accentuates the need for proactive, real-time, and continuous communication channels.

By working closely with ISPs, CDNs, and third-party monitoring solutions throughout a live event, service providers can collaborate to identify and quickly mitigate any issues. That collaboration also helps establish a framework of best practices ahead of time.

For example, streaming massive events like the Super Bowl or the Olympics isn't something that just happens day-of; organizations have to begin preparing several months in advance. By emphasizing this collaboration, service providers can build out the proper framework to ensure they not only have the best infrastructure in place to prevent issues, but also the right strategies and processes in place to solve them if they do occur. Referencing past events and lessons learned is critical to successful vendor collaboration.

Leveraging AI and Machine Learning

As live event streaming continues to grow exponentially in scale, humans alone are increasingly limited in their ability to identify quality issues. Advances in AI and machine learning bring greater visibility into the OTT workflow and help streamline the troubleshooting process.

AI, for example, enables predictive analytics. Manually reviewing previous data sets, as mentioned above, can provide general insights and a direction to move forward, but AI can make this process nearly immediate, and more accurate, through advanced simulations. For instance, AI can help service providers anticipate audience sizes and peak traffic levels when an event is streamed at a particular time of day, day of the week, or time of the year.
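As a deliberately simplified illustration of that kind of forecast (the article does not describe the actual models involved), the sketch below fits an ordinary least-squares regression over a handful of hypothetical past events to estimate peak concurrency for a planned event. The feature choices and numbers are invented for the example; a production system would use far richer features and models.

```python
import numpy as np

# Hypothetical history of past live events:
# (day_of_week 0-6, start_hour local, marquee_event 0/1) -> peak concurrent viewers
history = [
    ((5, 20, 1), 1_200_000),
    ((6, 19, 1),   950_000),
    ((2, 21, 0),   180_000),
    ((3, 20, 0),   210_000),
    ((6, 13, 0),   140_000),
]

X = np.array([features for features, _ in history], dtype=float)
y = np.array([peak for _, peak in history], dtype=float)

# Ordinary least squares with an intercept term, standing in for the far
# richer models an AI-driven forecasting system would actually use.
X1 = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

def predict_peak(day_of_week: int, start_hour: int, marquee_event: int) -> float:
    """Estimate peak concurrency for a planned event (illustrative only)."""
    return float(np.array([day_of_week, start_hour, marquee_event, 1.0]) @ coef)

# e.g., a Saturday 8 p.m. marquee event
print(f"forecast peak viewers: {predict_peak(5, 20, 1):,.0f}")
```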

AI even gives us the ability to reconsider which metrics matter. For example, instead of measuring what might seem like an endless supply of data that can oftentimes be more misleading than helpful, consider an AI-based simulation that views content in a way similar to how a human would. With AI, service providers will one day be able to identify quality issues from a user perspective without needing a team physically present to monitor a stream. AI allows service providers to move away from metric-based measurements and toward perception-based measurements that truly impact the user experience.
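The article doesn't name a specific perception-oriented metric, but one widely used example of scoring video closer to how a viewer experiences it is VMAF. The sketch below, offered purely as an illustration rather than as the approach described above, shells out to ffmpeg's libvmaf filter (assuming an ffmpeg build with libvmaf enabled) and applies a simple quality floor; the file names, threshold, and log parsing are assumptions.

```python
import json
import subprocess
import tempfile

# Hypothetical inputs: a captured rendition of the live stream and a
# reference copy (e.g., the mezzanine or a high-quality encode).
DISTORTED = "delivered_rendition.mp4"
REFERENCE = "reference.mp4"
QUALITY_FLOOR = 80.0  # illustrative VMAF threshold, not an industry standard

def vmaf_score(distorted: str, reference: str) -> float:
    """Run ffmpeg's libvmaf filter and return the mean VMAF score.

    Requires an ffmpeg build with libvmaf; the JSON log layout varies
    slightly between libvmaf versions, so parse defensively.
    """
    with tempfile.NamedTemporaryFile(suffix=".json", delete=False) as log:
        log_path = log.name
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", distorted, "-i", reference,
            "-lavfi", f"libvmaf=log_fmt=json:log_path={log_path}",
            "-f", "null", "-",
        ],
        check=True,
        capture_output=True,
    )
    with open(log_path) as f:
        report = json.load(f)
    pooled = report.get("pooled_metrics", {}).get("vmaf", {})
    if "mean" in pooled:
        return float(pooled["mean"])
    # Fallback for older log layouts: average the per-frame scores.
    frames = report.get("frames", [])
    scores = [fr.get("metrics", {}).get("vmaf", fr.get("VMAF_score", 0.0)) for fr in frames]
    return sum(scores) / max(len(scores), 1)

if __name__ == "__main__":
    score = vmaf_score(DISTORTED, REFERENCE)
    verdict = "OK" if score >= QUALITY_FLOOR else "INVESTIGATE"
    print(f"VMAF {score:.1f} -> {verdict}")
```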

Moving Live Event Quality Forward

As the OTT video market grows, so does the potential for roadblocks to delivering high-quality, flawless live event content to viewers. However, by placing the right emphasis on monitoring processes and solutions, on communication, and on leveraging the emerging benefits of AI, service providers can be prepared not only to quickly resolve issues as they arise, but also to reduce the chances of them happening in the first place.

 [Editor's Note: This is a contributed article from Akamai. Streaming Media accepts vendor bylines based solely on their value to our readers.]
