
Video: How Can We Deliver 360 VR Streaming Beyond 4K?

Learn more about 4K VR 360 at Streaming Media East.

Read the complete transcript of this clip:

Devon Copley: In the video production pipeline, the broadcast pipeline, the streaming pipeline, the rendering pipeline on devices, battery life, CPU power, GPU power, all those things have been scaled for standard HD video, and now to some degree for 4K video. That's allowed us to get to 4K spherical--that is, 4K all the way around, or 360. But getting past that is a big challenge, because at every point in that production chain you need to upgrade to something that puts you in uncharted territory. Obviously there is 8K television, but the standards don't exist, the interchange formats are still under debate. And really, only NHK and Sony have deployed anything and it's astronomically expensive. This stuff doesn't exist in the field. On the client side, you have decoders on mobile devices that max out at 4K. And then of course in between you have the challenge of bandwidth.

How do you address those problems? We have already seen, just in the last few months, 360 capture devices on the market that can capture at 6K or 8K. These are using built-in compression chips, which compromise quality to some degree, but they get the data down to the point where it can be delivered. Then you run into the problem of broadcast production. You can't use those 4K switchers anymore; there's no 6K switcher that you can buy off the shelf at the local store. Blackmagic doesn't make one yet. To get to 6K or 8K, that workflow needs to be reinvented.

Then delivery becomes an issue, because you can't deliver a 50-megabit stream today, or next year, or the year after that, or the year after that. Even if you could do it, it's probably not economical.

How's that going to get solved? That comes down to what's variously referred to as viewport-adaptive streaming or field-of-view streaming. For those of you who aren't familiar with this, it's analogous to adaptive bitrate streaming, except that instead of sending different bitrates, you send different parts of the sphere at different resolutions.
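
To make that idea concrete, here is a minimal sketch of viewport-adaptive tile selection. It assumes an equirectangular sphere split into a fixed 4x2 tile grid; the grid size, field of view, and margin values are illustrative assumptions, not taken from any particular standard or vendor implementation.

```python
# Minimal sketch of viewport-adaptive tile selection (illustrative only).
# Assumes an equirectangular sphere split into a 4x2 grid of tiles; the grid
# size, field of view, and quality tiers are hypothetical, not taken from
# OMAF or any vendor's implementation.

TILE_COLS, TILE_ROWS = 4, 2   # tiles across yaw (360 deg) and pitch (180 deg)
FOV_DEG = 100                 # assumed headset field of view
MARGIN_DEG = 45               # extra margin so quick head turns don't outrun the stream

def tiles_for_viewport(yaw_deg, pitch_deg):
    """Return (high_res_tiles, low_res_tiles) for the current head pose."""
    high, low = set(), set()
    for col in range(TILE_COLS):
        for row in range(TILE_ROWS):
            # Center of this tile in degrees.
            tile_yaw = (col + 0.5) * 360 / TILE_COLS - 180
            tile_pitch = 90 - (row + 0.5) * 180 / TILE_ROWS
            # Angular distance; yaw wraps around the sphere.
            d_yaw = abs((yaw_deg - tile_yaw + 180) % 360 - 180)
            d_pitch = abs(pitch_deg - tile_pitch)
            if d_yaw <= FOV_DEG / 2 + MARGIN_DEG and d_pitch <= FOV_DEG / 2 + MARGIN_DEG:
                high.add((col, row))   # in or near the viewport: request full quality
            else:
                low.add((col, row))    # behind the viewer: request a low-res fallback
    return high, low

# Example: viewer looking straight ahead (yaw 0, pitch 0)
hi, lo = tiles_for_viewport(0, 0)
print("high-res tiles:", sorted(hi), "low-res tiles:", sorted(lo))
```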

There are multiple implementations of this in the market now. There is an in-progress MPEG OMAF standard, an omnidirectional media format standard, which is being published later this year.

Nokia has developed our tile media standard, which has the benefit of being very simple. It requires no changes to the codec, no changes to the CDN; it's simply a protocol for defining in the manifest how a sphere can be broken up into separate tiles that are delivered as separate streams, to be assembled on the client side. It has already been implemented by a couple of vendors. I know Harmonic is looking closely at it, and has done some implementation on some similar schemes as well.
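
As an illustration of the manifest-driven approach described above, the sketch below shows a hypothetical tiled-sphere manifest and how a client might build its segment requests from it. The field names, URL templates, and layout are invented for illustration; they do not reflect Nokia's actual tile media format or the OMAF specification.

```python
# Hypothetical sketch of a tiled-sphere manifest and client-side request plan.
# All field names and URL templates are invented for illustration.

manifest = {
    "projection": "equirectangular",
    "full_resolution": [7680, 3840],        # notional 8K source
    "tiles": [
        # Each tile is an independently decodable stream covering one region
        # of the sphere, described here by yaw extents in degrees.
        {"id": "t0", "yaw": [-180, -90], "url": "t0/{quality}/seg-{n}.mp4"},
        {"id": "t1", "yaw": [-90, 0],    "url": "t1/{quality}/seg-{n}.mp4"},
        {"id": "t2", "yaw": [0, 90],     "url": "t2/{quality}/seg-{n}.mp4"},
        {"id": "t3", "yaw": [90, 180],   "url": "t3/{quality}/seg-{n}.mp4"},
    ],
}

def segment_requests(manifest, visible_tile_ids, segment_number):
    """Build the segment URLs for one segment duration: visible tiles at
    high quality, everything else at a low-quality fallback."""
    requests = []
    for tile in manifest["tiles"]:
        quality = "high" if tile["id"] in visible_tile_ids else "low"
        requests.append(tile["url"].format(quality=quality, n=segment_number))
    return requests

# Example: the viewer is currently facing tiles t1 and t2.
print(segment_requests(manifest, {"t1", "t2"}, 42))
```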

There will be some announcements from Nokia in the coming weeks. I can't actually make one now unfortunately, but very soon you'll hear more from us about how we hope to enable the community to use this standard more broadly, mostly as a stopgap between now and the time when OMAF and the proper standards emerge from the standards bodies. This is something that's quite easy to implement.

The upshot is, you don't deliver the full sphere at full resolution. The result is not only a lower bitrate, so with no other changes you get 40-50% bitrate savings, but also significant benefits on the client side. The client doesn't have to decode the entire sphere, and that allows you to drive higher resolutions, 6K or even 8K monoscopic, 6K stereoscopic, on devices that only have 4K hardware decoders, because you don't actually have to decode the entire frame all at once.
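
A rough back-of-the-envelope calculation, using assumed (not measured) numbers for viewport coverage and fallback resolution, illustrates why a 4K hardware decoder can keep up with 6K or 8K spherical content when only the viewport tiles are decoded at full resolution:

```python
# Back-of-the-envelope arithmetic (illustrative assumptions, not measured
# figures) for why viewport-adaptive streaming lets a 4K hardware decoder
# drive 6K or 8K spherical content: only the viewport tiles are decoded at
# full resolution, plus a small low-res fallback copy of the whole sphere.

DECODER_LIMIT = 3840 * 2160   # ~8.3 Mpx/frame for a 4K decoder
VIEWPORT_SHARE = 0.25         # assume viewport tiles cover ~1/4 of the sphere
FALLBACK = 1280 * 640         # assumed low-res copy of the full sphere

for name, (w, h) in {"6K": (6144, 3072), "8K": (7680, 3840)}.items():
    full = w * h
    decoded = full * VIEWPORT_SHARE + FALLBACK
    print(f"{name}: full frame {full / 1e6:.1f} Mpx, "
          f"viewport-adaptive decode {decoded / 1e6:.1f} Mpx "
          f"({'within' if decoded <= DECODER_LIMIT else 'over'} the 4K decoder limit)")
```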

The result is that in 2018 you will see the emergence of 6K and 8K broadcast-capable pipelines. As a first step toward that, just three weeks ago we worked with several partners to deploy, at the Oculus Connect conference, what was as far as I know the first live public livestream that had dual 4K, meaning full 4K per eye, live mixed ambisonic audio, and viewport-adaptive streaming. We were delivering 4K per eye at about 12 megabits a second. That's twice the resolution that we had here not even a year ago, at half the bandwidth.

Those advances are on the way. The bigger challenge then becomes, as I mentioned before, the broadcast production pipeline. How do you do a proper broadcast at 6K, or 8K, or 8K stereo? Those tools don't exist, so that becomes the next hurdle.

Related Articles

Video: What Works Today in Video VR?

Streaming Video Alliance's Jason Thibeault, Wowza's Chris Michaels, and Neulion's Jim Clements discuss what content owners can do today, strategically and technically, to make VR viable in the consumer entertainment space.

Video: When Will VR Entertainment Deliver ROI?

SkyVR's Bruce Courtney, Neulion's Jim Clements, and Nokia's Devon Copley debate the challenges VR content creators face in creating quality VR content that leverages the medium's strengths and delivers a profitable return within a limited market.

Video: What 360 Streaming Can Deliver That Other Media Can't

NBC News 360 VR Video Specialist Edwin Rogers discusses the unique ability of live 360 VR streaming as a news platform to put viewers in the center of the action at world events in this clip from Live Streaming Summit.

Video: The Problem With Streaming 4K Today

Limelight Networks' Charles Kraus describes the obstacles to real-world 4K delivery while the industry pushes 4K screens everywhere, in this clip from Streaming Media East 2017.

Video: How to Reduce Latency for Mobile VR Streaming

Yahoo Director of Engineering Satender Saroha discusses latency issues particular to VR streaming to mobile and technical measures to address them.

Video: VR vs. 360°: What's the Difference?

Scott Mayerowitz, Digital Storytelling Editor at the Associated Press, breaks down the differences between "VR/Virtual Reality" and "360," two terms often used interchangeably.

Video: How to Gear Up for Live 360/VR Streaming

Tim Dougherty of Wowza Media Systems lays out the key components of a workflow for capturing, ingesting, and streaming live VR/360 content in this clip from Streaming Media East 2017.
