Blurred Lines: How to Manage and Integrate On-site and Off-site Audience and Presenter Interactions in Live Webcasts
Recording a presentation that is delivered to a live audience for later on-demand viewing is nothing new. Neither is webcasting live presentations. But increasingly, the lines of what is considered the norm for presentations are being blurred: online participant interaction is being incorporated into the live audience's experience, and presenters are delivering their talks remotely to on-site and online audiences at the same time. This article discusses the challenges producers face when managing live and online audiences and presenters concurrently.
In this article, I'll share two very different methods and technologies for achieving this, drawing on streams I produced for two different clients.
Time-shifting a Live Production
The ability to manipulate the space-time continuum is the basis for many works of science fiction. For an observer standing on Earth, time and space are relative, but because the speed of light is roughly 300 million meters per second, we will never experience noticeable special-relativistic time dilation unless we leave Earth. Both Einstein's theories of relativity and later experiments, in which atomic clocks flown at high speed returned slightly behind control clocks left on Earth, demonstrate that it is possible to experience time at a different rate than you would if you stayed put.
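The effect described above is quantified by the Lorentz factor. For a clock moving at speed v relative to an Earth-bound observer (with c the speed of light):

```latex
\Delta t_{\text{Earth}} = \frac{\Delta t_{\text{moving}}}{\sqrt{1 - v^2/c^2}}
```

At everyday speeds, v is so much smaller than c that the denominator is effectively 1, which is why clocks on fast-moving craft return only fractions of a second behind their Earth-bound controls.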
More extreme examples of this phenomenon are integral to the storylines of two of Matt Damon's three recent space movies, Interstellar and The Martian, where characters age at different rates relative to what we experience on Earth, depending on how fast they are travelling or orbiting.
As fascinating as the space-time continuum is to discuss and manipulate in works of fiction, real-world video producers have to find other methods of controlling time; we simply don't have the luxury of inducing time dilation with rocket ships.
My first foray into time-shifting live productions was my most complicated to date. The client had four presenters who took turns presenting. Each presenter also showed a video or slides and was assigned her own dedicated camera to look into. For audio, each wore a wireless lavalier microphone. On the surface this might not sound too complicated. Four-camera live switches with video and slide playback are the bread-and-butter of many event production companies and aren't much more complicated than a two-camera live switch; you just have to manage more camera and computer inputs. What made this gig uniquely challenging was that instead of the four presenters being together in my studio in Greater Vancouver, Canada, only three were physically here. The fourth was joining us from a studio in Germany.
In our initial tests, the client impressed on us how important it was to have each of the four presenters appear equal in terms of their video and audio quality. The remote presenter, my client's client, was the CEO of a company that, as of this writing, is valued at just under $100 billion. They quickly ruled out having the fourth participant connect via satellite because none of us had that expertise. A Skype call was also ruled out, because they didn't feel that technology would deliver high enough quality.
The solution was to connect the remote presenter in Germany to my studio in Canada via a high-definition webcast on Ustream Enterprise (Figure 1, below). The incoming webcast feed went into the video switcher in my Canadian studio, which also handled all the live camera, video playback, and presentation slide feeds. The HD quality of the incoming webcast signal was sufficient, and the gap between it and the live camera feeds was lessened further when the program output was re-broadcast to a global online audience.
Figure 1. Here I am behind the controls of a live video switch and webcast using the Blackmagic ATEM 1 M/E and Ustream Enterprise. Click the image to see it at full size.
Now here is where the time dilation requirement comes into play. A "live" webcast has a broadcast latency of 15–30 seconds before it is received by viewers. This meant there was no way to cut the in-studio Canadian cameras and audio together with the German webcast audio and video in real time. We had no ability to mess with time dilation, so we did the next best thing: we delayed our live in-studio video and audio to match the delay on the German webcast feed, and then cut the newly resynchronized sources in "real time", 15 seconds after the fact.
The technology I used for the delay was the NewTek 3Play 425 instant replay system (Figure 2, below). After we manually matched the timing of the delay, we crossed our fingers that the latency would not drift; fortunately, it did not. This solved the synchronization of the two feeds relative to each other, albeit delayed by 15 seconds, but we still needed the presenters on both ends to converse in real time so they knew when it was their turn to speak. This was also important for the Q&A portion of the presentation, when real (unplanted and unrehearsed) questions were read off-camera in Canada and presenters in both countries were asked to respond in turn.
Figure 2. We used the NewTek 3Play 425 instant replay system to delay the in-studio video to match the delay on the German webcast. Click the image to see it at full size.
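Conceptually, what the 3Play provided is a fixed-length delay line: every local frame is held back by a constant number of frames before it reaches the switcher. Here is a minimal Python sketch of the idea; the class and parameter names (`FrameDelay`, `delay_seconds`, `fps`) are mine for illustration, and the real delay was of course done in hardware, not software:

```python
from collections import deque

class FrameDelay:
    """Fixed delay line: emits each frame a constant number of
    frames after it was pushed, so the local feed can be cut
    against a high-latency webcast feed."""

    def __init__(self, delay_seconds, fps):
        # e.g. a 15-second delay at 30 fps = 450 frames of buffer
        self.size = int(delay_seconds * fps)
        self.buffer = deque()

    def push(self, frame):
        """Push the newest local frame. Returns the delayed frame
        once the buffer is full, otherwise None (still filling)."""
        self.buffer.append(frame)
        if len(self.buffer) > self.size:
            return self.buffer.popleft()
        return None
```

With a 15-second delay at 30 fps, the first 450 pushes return nothing while the buffer fills; from then on, each push emits the frame captured 450 frames (15 seconds) earlier. The key operational risk, as noted above, is that this only works if the webcast latency stays constant: a fixed buffer cannot absorb drift.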
The Canadian studio presenters could hear each other as they were in the same room, but real-time audio between Canada and Germany wasn't possible through the webcast feeds because of the delay. The solution was a phone-bridge connection between the two sides, played back locally on a small stage monitor (speaker).
It all came together, and although I definitely wish I'd had another chance to get it just right, or an actual rehearsal before we went live, we successfully synchronized two sets of live and interactive video sources with a massive latency difference between them (Figure 3, below).
Figure 3. vMix video switching and webcasting software showing a queue of remote presenters waiting their turn to go live on a webcast. The laptop on the right is receiving the webcast feed and has a 30-second broadcast delay. Click the image to see it at full size.