
Posted: June 12, 2019 | Updated: June 12, 2019

It's November 2020, only days before the presidential election. Early voting is underway in several states as a video suddenly spreads across social media. One of the candidates has disclosed a dire cancer diagnosis and is making an urgent plea: "I'm too sick to lead. Please, don't vote for me." The video is quickly revealed to be a computer-generated hoax, but the damage is done, especially as trolls eagerly push the line that the video is actually real and the candidate has just changed her mind.

Such a scenario, while seemingly absurd, would actually be possible to achieve using a deepfake, a doctored video in which a person can be made to appear as if they're doing and saying anything. Experts are issuing increasingly urgent warnings about the advance of deepfake technology: both the realistic nature of these videos, and the ease with which even amateurs can create them. The possibilities could bend reality in terrifying ways. Public figures could be shown committing scandalous acts. Random women could be inserted into porn videos. Newscasters could announce the start of a nonexistent nuclear war. Deepfake technology threatens to provoke a genuine civic crisis, as people lose faith that anything they see is real.

House lawmakers will convene on Thursday for the first time to discuss the weaponization of deepfakes, and world leaders have begun to take notice.

"People can duplicate me speaking and saying anything. And it sounds like me and it looks like I'm saying it, and it's a complete fabrication," former President Barack Obama said at a recent forum. "The marketplace of ideas that is the basis of our democratic practice has difficulty working if we don't have some common baseline of what's true and what's not." He was featured in a viral video about deepfakes that portrays him calling his successor a "total and complete dipshit."

How Deepfakes Are Made

Directors have long used video and audio manipulation to trick viewers watching scenes with people who didn't actually participate in filming. Peter Cushing, the English actor who played Star Wars villain Grand Moff Tarkin before his death in 1994, reappeared posthumously in the 2016 epic Rogue One: A Star Wars Story. The Fast and the Furious star Paul Walker, who died before the series' seventh movie was complete, still appeared throughout the film through deepfake-style spoofing. And showrunners for The Sopranos had to create scenes with Nancy Marchand to close her storyline as Tony's scornful mother, after Marchand died between the second and third seasons of the show.

Thanks to major strides in the artificial intelligence software behind deepfakes, this kind of technology is more accessible than ever.

Here's how it works: Machine-learning algorithms are trained on a dataset of videos and images of a specific individual to generate a virtual model of their face that can be manipulated and superimposed. One person's face can be swapped onto another person's head, like this video of Steve Buscemi with Jennifer Lawrence's body, or a person's face can be toyed with on their own head, like this video of President Donald Trump disputing the veracity of climate change, or this one of Facebook CEO Mark Zuckerberg saying he controls the future. People's voices can also be imitated with advanced technology. Using just a few minutes of audio, firms such as Cambridge-based Modulate.ai can create "voice skins" for individuals that can then be manipulated to say anything.
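To make the mechanics concrete, here is a minimal, illustrative sketch (in Python with PyTorch) of the shared-encoder, per-identity-decoder setup that underpins many face-swap deepfakes. The layer sizes, the 64x64 crop resolution, and the random tensors standing in for real face crops are all assumptions for illustration, not any particular tool's code.

```python
# Sketch of the shared-encoder / per-identity-decoder architecture commonly
# used for face-swap deepfakes. Dimensions and data are illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses an aligned 64x64 face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Renders a face from the latent vector; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns identity-independent facial structure;
# each decoder learns to render one specific person's appearance.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training sketch: reconstruct each person's faces through their own decoder.
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for aligned crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for aligned crops of person B
for _ in range(1):  # a real run trains for many epochs on thousands of frames
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person A's face but render it with person B's decoder,
# producing B's likeness driven by A's pose and expression.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

In production tools, the swapped crop is then blended back into the original video frame by frame; the same shared-encoder idea is why the results track the source actor's expressions so closely.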

It may sound complicated, but it's rapidly getting easier. Researchers at Samsung's AI Center in Moscow have already found a way to generate believable deepfakes with a relatively small dataset of subject imagery, potentially even a single image, according to their recent report. Even the Mona Lisa can be manipulated to look like she's come to life.

There are also free apps online that allow ordinary people with limited video-editing experience to create simple deepfakes. As such tools continue to improve, amateur deepfakes are becoming more and more convincing, noted Britt Paris, a media manipulation researcher at the Data & Society Research Institute.

"Before the advent of these free software applications that allow anyone with a little bit of machine-learning experience to do it, it was pretty much exclusively entertainment industry professionals and computer scientists who could do it," she said. "Now, as these applications are free and available to the public, they've taken on a life of their own."

The ease and speed with which deepfakes can now be created is alarming, said Edward Delp, the director of the Video and Imaging Processing Laboratory at Purdue University. He's one of several media forensics researchers who are working to develop algorithms capable of detecting deepfakes as part of a government-led effort to defend against a new wave of disinformation.

"It's scary," Delp said. "It's going to be an arms race."