
Digitalsmiths Releases VideoSense 3.0; Metadata Geeks Rejoice


During last year's Open Video Conference, Digitalsmiths' chief technical officer talked about how metadata drives VideoSense, the company's asset management and indexing solution.

"Our system does asset management, storage, and playback, but the advanced metadata aspects are what set us apart," said Matt Berry. "We've got the ability to cut the tracks of metadata much like the way an Avid (editing system) would cut different tracks of video, creating custom metadata tracks."

This week, the company announced the release of VideoSense 3.0, featuring an updated user interface (UI) that lets users drill down through time slices to gain deep access to metadata.

"Our strategic direction is now focusing on metadata and time-based aspects," said VideoSense product manager Tim Jones during a call last week. "The revamped UI allows us to repurpose real estate away from scheduling and more toward many, many layers of metadata content per segment, shot, or even just a few seconds of a shot."

"Our studio clients have shown us that a genre may change from scene to scene within a single movie," Barry said during his Open Video Conference session. Jones says they've found that content indexing needs to be tightened down into the sub-shot level.

"We look at metadata across a timeline much like the concept of a mixing desk, with individual tracks of data," says Jones. "Our solution combines automated indexing-taking closed captioning, keywords from speech-to-text, facial recognition, and scene-change detection-with manual indexing about particular content within a scene or shot, so our customers can tag very quickly what's in that shot."

During a demonstration, Jones explained that content can be displayed with a variety of colored tags to designate automated indexing, manual metadata entry, or a combination of the two.

"Orange may show the actors, while grey may show automated indexing of metadata," he says. "What's important is that information on a particular track can easily be copied so that it can appear in other shots or segments. For instance, if a bus appears in a portion of multiple shots, the metadata track created for the portion of one shot can be copied and rapidly applied to the other shots."

One of the biggest issues around the presentation of deep metadata is how to handle blank tracks. VideoSense 3.0 lets users both filter and nest tracks, leaving the most pertinent tracks available to drill down into vertically. Tracks with nothing in them for a particular time slice remain viewable, but blank.
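A hypothetical time-slice view along the same lines shows how that behavior could work: filtering narrows which tracks are shown, but any surviving track with no entry at the current time still appears, just empty.

```python
def time_slice(tracks: list[MetadataTrack], t: float,
               keep: set[str] | None = None) -> dict[str, list[TrackEntry]]:
    """Return the view of every (optionally filtered) track at time t.

    Tracks with no entry at t map to an empty list rather than being
    dropped, matching the blank-but-viewable behavior described above.
    """
    selected = tracks if keep is None else [tr for tr in tracks if tr.name in keep]
    return {tr.name: tr.at(t) for tr in selected}
```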

"The new version refines the UI workflow even more, meeting the needs of customers who need to work more with deep metadata in a vertical sense than a horizontal sense," he says.

Digitalsmiths claims it has indexed more than 3 million tracks of metadata for its key customers, the major studios, into MetaFrames, Digitalsmiths' name for its time-based metadata repositories. The repositories include 29,000 pieces of premium video content, totaling approximately 200 terabytes.

When asked about the competition, such as Pictron, which focuses on news solutions, Jones said he doesn't see many other companies focusing on the level of premium content that Digitalsmiths works with.

"We're coalescing around movie and TV shows for our major clients-the studios," says Jones. "Various other companies may be doing a variety of pieces, but our timeline-based UI is the best on the market for our target customer base."

Finally, when asked about the limited ability of search engines like Google to find content within a portion of a video clip, Jones said that metadata frameworks like MetaFrame are critical to allowing the average user to see the value of deep metadata.

"We've moved quickly to establish a unified framework for the creation, aggregation, and distribution of rich metadata," said Jones. "Our customers see the value of both rapid and deep metadata and content tagging, and we suspect that the search engines will perform their own indexing better against a unified framework."

As if to prove that latter point, Digitalsmiths quotes an industry analyst on this topic in its VideoSense 3.0 press release.

"Google's new more timely and media-rich web indexing system, called Caffeine, increases the importance deep metadata plays in the monetization of content," said Colin Dixon, senior partner at TDG. "Using a time-based metadata technology is the only way to ensure that new content is found by Caffeine fast."
