Deepfakes and the War on Reality
"You want to make sure content is coming from the person that it says it is, and you want to make sure it's not been tampered with," says Peterka. "If I am The Wall Street Journal, and I want to make sure I'm not publishing any fake news or fake videos, I may ask the sources to provide some proof of authenticity." This scenario could be an important part of content acquisition for publishers going forward. "You may need to actually sign that video and prove that it was you. It still doesn't guarantee that you didn't create a fake, but at least they associate you [with this content]," says Peterka.
Why Are We Surprised?
Building imaginary content is nothing new. "Everything we do is 'fake'—we're in the movie and TV industry," says Craig Seidel, senior vice president of MovieLabs.
"People working in the film industry have been doing face swaps for quite some time with the kind of highly sophisticated techniques," says Rautiainen. Of course, in fictional film and television, the audience knows what they're seeing is "fake."
CGI has created entirely new environments, some obviously manufactured and some more subtle. "When you take a movie and decompose it frame by frame, the same way the CGI added layers of imagery, you should be able to detect that it was actually a layer of graphics on top of the original video," says Peterka.
It's not only the entertainment industry that manipulates images. Photoshop has been used to make photos more aesthetically pleasing, says Rautiainen. "This kind of image manipulation has been in use for decades. Now this next generation is actually starting to work on layers that really break the previous rules," he says. (It's worth remembering the Time magazine O.J. Simpson cover, where Simpson's face was manipulated to appear darker than it actually was.)
Seidel is a mentor for an entrepreneurship class at Stanford, where this next generation of students is starting to consider what the effects of altering content might be.
The team conducted more than 100 interviews as they looked for an industry for which they might build their student product. The students thought the military, social platforms, and media would be potential markets, but their research ran into roadblocks, from confidentiality issues (military) to social platforms and media outlets handing the problem off to third-party fact-checking organizations. In the end, it was the fact-checking groups who were most open to talking to the students; you can view their final report here.
Going To Market (In the Future)
For now, there's no commercial or scalable tool or service for detecting deepfakes.
"It's an issue that we have been doing a lot of thinking about," says Chu. "There are a half dozen folks that have projects underway, but no one has a commercial offering yet in our analysis."
"This is really, a daunting task, because you would have to scan 30 frames per second in a one-minute video clip of someone speaking. A word can come out in sub one-second, right? So in one minute, there's 60 times 30. Now you're talking about 1,800 frames that you have to go scan individually and maybe catch a half a second where someone said a one-word difference," says Chu.
"There's lots of machines in the sky, but to be able to truly police it, it's massive scales of computing that don't exist today, along with, the scanning algorithms, that have yet to be written," he says. "We're really years away from a real commercial implementation at scale."
"I can foresee a future where cameras for example start encoding the media so that it can be originally traced back to a device," says Rautiainen. Another method to ensure content authenticity would lie with the distributors. Media platforms, whether traditional or online, could provide some kind of certification of the authenticity of the clip, says Rautiainen. "The media distribution platforms I think need to become responsible on the authenticity scheme, like the internet bodies who standardize and regulate the internet have created standards for certification of the authenticity of the website, so the 'man in the middle' attacks wouldn't happen," he says.
Taking Action
A few laws have already been put in place. California became the first U.S. state to criminalize the use of deepfakes in political campaign promotion and advertising, making it a crime to publish audio, imagery, or video that gives a false, damaging impression of a politician's words or actions within 60 days of an election. Another law allows California residents to sue anyone creating deepfake revenge porn (superimposing someone's face onto another person's body). There are certain exemptions for content that features clear disclaimers.
"Once a deepfake is detected, identifying information should be published so it can be blocked or tagged (as appropriate). Content identification through techniques such as fingerprinting should be able to identify the fake in the wild," says Seidel.
"I anticipate people will voluntarily block deep fakes. For example, I'd be perfectly happy for Chrome to flag or block deepfakes in the browser. Hopefully, social and UGC sites will also voluntarily block (this)," says Seidel. "However, there will always be critics of blocking anything, so legislation might be necessary to give sites ‘cover'. Presumably tools will be available to journalists and others interested in the truth to detect."
"I do believe that there is a commercial market for it because people want to know how genuine something is or isn't." says Chu.
Until commercial technology becomes more readily available, our experts passed along these three steps to consider.
- Trust but verify all content.
- Ensure chain of custody on content to identify content origins.
- Don't believe everything you see.
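The "chain of custody" step can be illustrated with a simple hash chain, a hedged sketch rather than any expert's prescribed method: each handler appends a link derived from the content's digest and the previous link, so a later alteration of either the content or the history produces a different chain head.

```python
# Sketch of chain of custody for media: each handoff adds a link that hashes
# the content digest together with the previous link. Replaying the chain
# over the received bytes verifies both the content and its custody history.
import hashlib

def link(prev_hash: str, handler: str, content: bytes) -> str:
    """Append one custody link for the named handler (names are illustrative)."""
    digest = hashlib.sha256(content).hexdigest()
    return hashlib.sha256(f"{prev_hash}|{handler}|{digest}".encode()).hexdigest()

clip = b"...video bytes..."
h0 = link("genesis", "camera", clip)      # origin device
h1 = link(h0, "news-desk", clip)          # first recipient

# Verification: replaying the chain over untouched bytes reproduces the head;
# tampered bytes diverge at the first link and everywhere after.
assert link(link("genesis", "camera", clip), "news-desk", clip) == h1
assert link(link("genesis", "camera", clip + b"x"), "news-desk", clip + b"x") != h1
```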
"There is a responsibility on us as consumers and users, and sometimes it feels to me that this is being somewhat ignored," says Peterka. "I think we have a responsibility to apply common sense and critical judgment before we act on things or before we forward them."
Fact or fiction, some video content is just too good not to share, and that is the problem every news organization, social media platform, and technology company will be battling. This is only round one of what has the potential to be a drawn-out, multi-front war on reality.