Seeing but not believing: Inside the business of 'deepfakes'

Experts fear deepfakes could become the next frontier in fake news.

December 10, 2018, 9:36 PM

In a video seen by millions, a man who looks and sounds just like President Obama gives an address. But instead of delivering a polished speech, he spouts controversial opinions -- and even a curse word.

A split screen reveals that it's not our 44th president making these comments after all -- but actor and frequent Obama impersonator Jordan Peele.

In a BuzzFeed production, Peele imitates Obama's voice, with video manipulation to match. It's what's called a "deepfake" -- a video created with artificial intelligence that converts footage of real people into potentially damaging doppelgangers, appearing to say and do things the real people never actually did.

In this case, it's Peele and BuzzFeed using the technology to create a sort of PSA, revealing their trick to the viewer at the end.

But experts fear that in the wrong hands, deepfakes could become the next frontier in fake news -- and spark very real consequences.

“The challenge is it goes to the heart of our trust in visual media,” Dr. Matt Turek, head of the media forensics program at the Defense Advanced Research Projects Agency, run by the U.S. Department of Defense, explained. “We’re used to looking at an image or video and putting faith in it -- believing the content of the image or the video. And with the rise of the ability to easily manipulate those, that is going to put our faith in visual media in jeopardy.”

Deepfakes first sparked widespread concern last year, when Reddit users began posting fake pornographic videos online, primarily targeting actresses like “Wonder Woman” star Gal Gadot and superimposing her face onto X-rated content without her permission.

The early fakes were riddled with glitches, but as the technology continues to evolve, some worry deepfakes could become indistinguishable from the real deal -- potentially swaying elections, triggering widespread panic or riots, or even sparking a war. It's these worst-case scenarios that have caught the attention of many public officials, from lawmakers to the Department of Defense.

“A lot of times there are some indicators that you can see, particularly if you are trained or used to looking at them. But it is going to get more and more challenging over time, so that is why we developed the media forensics program at DARPA,” Dr. Turek said.

“There is certainly a bit of a cat and mouse game going on,” he added. “For every new manipulation that comes out, we work on additional detecting capabilities and vice versa.”

Dr. Turek’s task at DARPA is developing algorithms that can spot the fake -- even when the human eye can’t.

How hard is it exactly to detect a deepfake? It depends on the quality. The better the fake, the more digging researchers may have to do.

“We are looking at sophisticated indicators of manipulation from low-level information about the pixels, to metadata associated with the imagery, the physical information that is present in the images or media, and then comparing it [to] information that we know about the outside world,” Dr. Turek said.
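One of those low-level signals is simple to illustrate. The Python sketch below, using the Pillow imaging library, pulls out metadata fields a forensic analyst might cross-check first; the chosen fields, the "missing fields" heuristic and the filename are illustrative assumptions, not DARPA's actual tooling.

```python
# A minimal sketch of one signal Turek describes: checking an image's
# embedded metadata against what it claims to be. Illustrative only --
# not DARPA's actual media forensics tooling.
from PIL import Image
from PIL.ExifTags import TAGS

def inspect_metadata(path: str) -> dict:
    """Collect EXIF fields that forensic analysts commonly cross-check."""
    img = Image.open(path)
    exif = img.getexif()
    fields = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    report = {
        "size_in_pixels": img.size,
        "software": fields.get("Software"),   # editing tools often stamp this
        "datetime": fields.get("DateTime"),   # should match the claimed capture time
        "camera_model": fields.get("Model"),  # often missing on re-encoded fakes
    }
    # Stripped or editor-stamped metadata isn't proof of manipulation,
    # but it is one signal that merits a closer pixel-level look.
    report["missing_fields"] = [k for k, v in report.items() if v is None]
    return report

print(inspect_metadata("interview_frame.jpg"))  # hypothetical filename
```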

While the technology isn’t flawless, there are some very convincing fakes out there -- many of them harmless spoofs of popular films.

But even the creators of these spoofs are aware of the concern surrounding them. We reached out to the person behind “Derpfakes,” a popular deepfakes YouTube account, who claims to be one of the early deepfake innovators.

They did not want to reveal their name, but told ABC that “all deepfakes have a level of controversy attached to them.”

The person behind Derpfakes added, “From my perspective the best approach… is to make the public familiar with the idea that what they see is not true by default… Either way, it’s here to stay.”

“I think the challenge is that it is easier to create manipulated images and video and that can be done by an individual now," Dr. Turek explained. "Manipulations that may have required state-level resources or several people and a significant financial effort can now potentially be done at home."

“It is significantly easier than it has been in the past -- for instance, to swap faces across video,” he continued. “That was a CGI technique that special effects houses in Hollywood were able to do years ago, that now can be done potentially at home on a PC.”

The creation of a deepfake is somewhat similar to the state-of-the-art special effects used in today’s filmmaking -- like the face-mapping used to add the late Paul Walker’s likeness to the film “Furious 7.”

But, deepfakes also have a lot in common with technology you’re probably more familiar with: the photo album on your phone that learns your friends’ faces, or “face swapping” on Snapchat.

“If you compare a deepfake to what a person can do on Snapchat on their phone -- deepfakes are much more believable,” said Jeff Smith, the associate director of the National Center for Media Forensics at the University of Colorado Denver.

“The challenging thing is that you have to have, at this point in time, pretty good skills with a computer,” he added.

It’s Smith’s job to create convincing deepfakes as part of a UC Denver program for DARPA’s partners. He explains that his team acts as the “manipulators”: “We act as the bad guys, creating manipulated audio, video and imagery -- as a set of data, as much data as we can provide -- so that the algorithms that are being developed under the program can be tested and understood in terms of their performance and limitations, and overall, the ability for the detection of manipulated media gets better.”
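That test-and-measure loop is straightforward to sketch. In the hypothetical Python example below, `detect_manipulation` stands in for a real detector, and the scoring function shows how labeled fake and authentic media can be used to measure hit rates and false alarms; none of it is the program's actual benchmark.

```python
# An illustrative sketch of scoring a detector against labeled media
# supplied by "manipulator" teams like Smith's. Hypothetical, not the
# DARPA program's real evaluation harness.
from typing import Callable, List, Tuple

def score_detector(
    detector: Callable[[str], bool],
    labeled_media: List[Tuple[str, bool]],  # (file path, is_manipulated)
) -> dict:
    """Report how often the detector catches fakes and falsely flags real media."""
    caught = missed = false_alarms = correct_passes = 0
    for path, is_fake in labeled_media:
        flagged = detector(path)
        if is_fake and flagged:
            caught += 1
        elif is_fake:
            missed += 1
        elif flagged:
            false_alarms += 1
        else:
            correct_passes += 1
    return {
        "detection_rate": caught / max(caught + missed, 1),
        "false_alarm_rate": false_alarms / max(false_alarms + correct_passes, 1),
    }

# Usage with a placeholder detector that flags everything:
detect_manipulation = lambda path: True  # hypothetical stand-in detector
print(score_detector(detect_manipulation,
                     [("real_clip.mp4", False), ("fake_clip.mp4", True)]))
```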

Smith walked ABC News’ Kyra Phillips through the steps of creating a deepfake -- from gathering information about a person's face through photos and videos, to training the computers to learn that person's features and synthesize a variety of expressions.
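At its core, that learning step is typically an autoencoder with one shared encoder and a separate decoder per person. The PyTorch sketch below is an illustrative reconstruction of that classic face-swap design, not the lab's actual model; the layer sizes, 64x64 resolution and variable names are assumptions.

```python
# A minimal sketch of the classic face-swap architecture: a shared
# encoder learns pose and expression from both people, and one decoder
# per person reconstructs that person's face. Sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a shared latent code."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(256 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs one specific person's face from the shared code."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 8, 8))

encoder = Encoder()                           # shared between both identities
decoder_a, decoder_b = Decoder(), Decoder()   # one per person

# The swap itself: encode a frame of person A, decode with B's decoder,
# producing B's face in A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)         # stand-in for a real face crop
swapped = decoder_b(encoder(frame_of_a))
```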

PHOTO: Jeff Smith at the University of Colorado Denver walked ABC News' Kyra Phillips through the process of face swapping -- giving it a try. (ABC News)

“Once we are satisfied with the training or the results as they are created, then we conduct the swap itself,” Smith said, demonstrating a deepfake where he “swaps” faces with Phillips.

But unlike a faceswap on Snapchat, a good deepfake takes time to make.

“Within the first couple hours… it starts to come into focus,” Smith explained, as he demonstrated how the computers learn and then morph the faces. "So as that training process keeps going, the models get better and better and better, and it can take, again, hours to days."
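Building on the architecture sketch above, the training loop Smith describes amounts to repeatedly asking each decoder to rebuild its own person's face, which forces the shared encoder to capture pose and expression; the optimizer, learning rate and stand-in data below are assumptions.

```python
# An illustrative training loop for the Encoder/Decoder sketch above.
# Each decoder only ever reconstructs its own person, which is what
# makes the later cross-decode swap work. Hyperparameters are assumed.
import torch

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=5e-5,
)
loss_fn = torch.nn.L1Loss()

faces_a = torch.rand(32, 3, 64, 64)  # stand-ins for aligned face crops of person A
faces_b = torch.rand(32, 3, 64, 64)  # ...and of person B

for step in range(10_000):  # in practice this runs for hours to days
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()
```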

The technology behind deepfakes is getting better and better as well. "In another year or two, there will be either improvements to this system or a new system that can do the same thing -- maybe even better, maybe faster, maybe more readily available to the public,” Smith said.

In Smith's classroom, his students work to stay ahead of the curve. As they demonstrate their own manipulations, one student named Taylor shows off one of his latest creations: a video of a quiet college campus into which he has added an assault-style rifle.

PHOTO: Jeff Smith's students at the University of Colorado Denver are learning to use the latest technology to identify fake media. (ABC News)

“I chose a weapon because it’s definitely more shocking, and it makes you more aware of what’s possible with manipulated videos,” the student explained. “So showing a gun on a campus, just sitting there, seems pretty shocking to a viewer and hopefully makes them think more about it.”

His hope is to “help to develop the software to detect fake media, so hopefully people aren’t tricked by videos that you can easily make with just the software we have.”

Another student named Carly hopes to “get users to actually think critically -- to actually look at an image and question its authenticity. The awareness of this program is a step in the right direction.”

And while seeing may not necessarily mean believing in the digital age, looking beyond face value may be more important than ever.

This report was featured in the Tuesday, Dec. 11, 2018, episode of ABC News' daily news podcast, "Start Here."

"Start Here" is the flagship daily news podcast from ABC News -- a straightforward look at the day's top stories in 20 minutes. Listen for free every weekday on Apple Podcasts, Google Podcasts, iHeartRadio, Spotify, Stitcher, TuneIn, or the ABC News app. On Amazon Echo, ask Alexa to "Play 'Start Here'" or add the "Start Here" skill to your Flash Briefing. Follow @StartHereABC on Twitter, Facebook and Instagram for exclusive content, show updates and more.