
Producing Video Case Studies

Many organizations use video case studies to help market their products and services. I recently analyzed eleven video case studies, focusing on high-level production techniques, how the videos were encoded, and how they were presented on the companies' web pages. This article presents my findings and should be useful to marketing professionals, video producers, compressionists, and web developers.

I'll present the findings in four sections:

  • Useful statistics
  • Good techniques to emulate
  • Mistakes to avoid
  • Best practices for case studies

As an overview, I identified the case studies by Googling "video case studies" and trolling through the results. To be included, a case study had to be downloadable, had to have been created after 1/1/2009 (as best I could determine), and had to be used to sell a product or service.

Ultimately, I found videos from companies like Xerox, Comcast, Panasonic, Fujitsu, Tandberg and Blackberry, just to name a few. To me, it looked like all the videos were professionally produced and edited; none appeared to be the work of an ambitious marketing director with a camcorder and no budget. 

Again, when analyzing the videos for this article, I focused primarily on video encoding and web site presentation. In a future article, I'll examine the actual content and discuss issues like how the marketing claims were presented and proved, the use of B-roll vs. A-roll, how and where background music was used, and the like.

Let's start with some useful stats about encoding and presentation. 

Useful Stats

Here are some useful stats from the videos that I analyzed. 

  • Duration: The average duration was 4:04 (min:sec) with the longest at 8:20 and the shortest at 1:33.
  • Resolution: The average resolution was 413x258, with the largest at 640x480, and the smallest at 220x164.
  • Video data rate: The average video data rate was 466 kbps, with a high of 917 kbps, and a low of 200 kbps.
  • Bits/pixel/frame: The average bits per pixel per frame was .24, with a high of .557 and a low of .061. This is a measure of how much data is applied to each pixel in the video, and it lets you compare, for example, the compression applied to a file encoded at 640x360@448 kbps (b/p/f = .065) with a video encoded at 220x164@250 kbps (b/p/f = .577); see the short calculation sketch after this list. The easiest way to derive this number is to download the free, cross-platform MediaInfo utility and analyze your file within the program. What's the significance of bits/pixel/frame? My rule of thumb is that any value over .1 is likely a waste of bandwidth, though at resolutions below 400x300, .15 is acceptable, since codecs are less efficient at lower resolutions. If you see a value of .557, you can pretty much assume that the data rate is far higher than it needs to be. The figure shows MediaInfo, a tool I have installed on all of my computers.

    [Figure: MediaInfo analysis of one of the case study files]
  • Codec: Six of the eleven videos used the VP6 codec, while three used the antiquated H.263 codec, a choice that's hard to explain for videos encoded after 2008. Three others, which the respective sites presented via embedded YouTube players, were encoded with H.264. All the videos were presented in the Flash Player. To be fair, Microsoft and Cisco both presented case studies in the Windows Media Video format, but both were produced prior to 1/1/2009 and so missed the cut.
  • Audio data rate: The average audio data rate was 102 kbps, with a low of 16 kbps and a high of 256 kbps; nine of the eleven streams were produced in stereo, two in mono. My rule of thumb for a case study, which is primarily recorded monaural speech, is 32 kbps for AAC audio and 64 kbps for MP3 audio (see the quick size calculation after this list). Obviously, my thumb was not familiar to most of these producers.
  • Use of social media: Here's a surprise: if you don't count videos embedded via YouTube (which are all linkable and embeddable), only one of the case studies was linkable and embeddable, and one other offered the ability to email the page into which the video was embedded. None of the other videos offered any social media or sharing links whatsoever. Obviously, some room for improvement here.
  • Call to action: Another surprise was that five of the eleven videos didn't have any special call to action, and several were blank players with no links at all. We learned in Marketing 101 that all marketing documents must have a call to action; it makes no sense to spend thousands on a case study and not make it easy for the inspired viewer to take the next step.
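
For anyone who wants to run the numbers on their own files, here's a minimal Python sketch of the bits/pixel/frame arithmetic referenced in the Bits/pixel/frame item above. The resolutions and data rates are the two examples from that item; the frame rates (30 fps and 12 fps) are my assumptions, chosen to reproduce the quoted .065 and .577 values, since the actual frame rates aren't listed here.

    # Minimal sketch of the bits/pixel/frame calculation discussed above.
    # Frame rates are assumed; MediaInfo reports the actual bitrate,
    # resolution, and frame rate for your own files.

    def bits_per_pixel_per_frame(bitrate_kbps, width, height, fps):
        """Bits of video data applied to each pixel of each frame."""
        return (bitrate_kbps * 1000) / (width * height * fps)

    examples = [
        # (label, bitrate in kbps, width, height, assumed fps)
        ("640x360 @ 448 kbps", 448, 640, 360, 30),
        ("220x164 @ 250 kbps", 250, 220, 164, 12),
    ]

    for label, kbps, w, h, fps in examples:
        bpf = bits_per_pixel_per_frame(kbps, w, h, fps)
        # Rule of thumb: over .1 is likely wasted bandwidth
        # (.15 is acceptable below roughly 400x300).
        print(f"{label} ({fps} fps): {bpf:.3f} bits/pixel/frame")

Running the sketch prints roughly .065 and .577, matching the comparison above; in practice, you'd plug in whatever bitrate, resolution, and frame rate MediaInfo reports for your file.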
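
To put the audio numbers in perspective, here's a quick back-of-the-envelope calculation, again a sketch rather than anything definitive: the bitrates are the ones mentioned in the Audio data rate item, and the five-minute duration is an assumption roughly in line with the 4:04 average.

    # Rough audio payload per case study at different audio bitrates.
    # The 5-minute duration is an assumption (the surveyed average was 4:04).
    DURATION_SECONDS = 5 * 60

    for label, kbps in [("32 kbps AAC (mono speech)", 32),
                        ("64 kbps MP3 (mono speech)", 64),
                        ("256 kbps (surveyed high)", 256)]:
        megabytes = kbps * 1000 * DURATION_SECONDS / 8 / 1_000_000
        print(f"{label}: about {megabytes:.1f} MB of audio per 5-minute video")

At 256 kbps, the audio alone runs to roughly 9.6 MB for a five-minute video, versus about 1.2 MB at 32 kbps, which is why the rule of thumb matters.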
Those are the high-level stats. For a more detailed look at the positive attributes I found during my survey, as well as sample videos, see the rest of this article at StreamingLearningCenter.com.
