I regret that I learned about the fifth Global Fact Checking Summit – indeed, that I learned that there was such a thing as a global fact-checking summit – only after it took place last month. It sure would have been a good excuse to get myself to Rome. (What? There are other excuses?) The summit was sponsored by the Poynter Institute, a nonprofit journalism school in St. Petersburg, Florida, and one of the panels I’d have liked to attend was devoted to “the coming fake videogeddon,” which will be brought to us (or not) by a technology known by the convenient nickname deepfake.
Deepfake videos are a concern for fact-checkers, but they're not as easy to create as the media let on. We should know — we tried to create one https://t.co/fEQEglSLoe #GlobalFactV
— IFCN (@factchecknet) June 20, 2018
Deepfake emerged in late 2017 thanks to the oeuvre of a Redditor who called himself “deepfakes” and who posted a video in which he’d (I’d be willing to bet real money it’s a he) used a machine-learning algorithm to superimpose the face of Wonder Woman actress Gal Gadot onto the body of a porn actress. He followed up, according to a December 2017 Motherboard story by Samantha Cole, with “hardcore porn videos featuring the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza.”
Cole wrote that the software employed by “deepfakes”
is based on multiple open-source libraries, like Keras with TensorFlow backend. To compile the celebrities’ faces, deepfakes said he used Google image search, stock photos, and YouTube videos. Deep learning consists of networks of interconnected nodes that autonomously run computations on input data. In this case, he trained the algorithm on porn videos and Gal Gadot’s face. After enough of this “training,” the nodes arrange themselves to complete a particular task, like convincingly manipulating video on the fly.
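For readers curious about the mechanics behind Cole's description: the approach is generally understood to be a pair of autoencoders that share one encoder but keep separate decoders, one per face. The sketch below is my own reconstruction in Keras with the TensorFlow backend, not the Redditor's actual code; the image size, network shape, and training loop are assumptions chosen to keep the example short.

```python
# A minimal sketch (my reconstruction, not "deepfakes'" code) of the shared-encoder /
# dual-decoder autoencoder commonly used for face swapping.
# Assumes 64x64 RGB face crops, aligned and scaled to [0, 1].
import numpy as np
from tensorflow.keras import layers, Model

IMG_SHAPE = (64, 64, 3)
LATENT_DIM = 256

def build_encoder():
    """Compress a face crop into a latent vector shared by both identities."""
    inp = layers.Input(shape=IMG_SHAPE)
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2D(256, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    z = layers.Dense(LATENT_DIM)(x)
    return Model(inp, z, name="encoder")

def build_decoder(name):
    """Reconstruct a face from the shared latent vector; one decoder per identity."""
    z = layers.Input(shape=(LATENT_DIM,))
    x = layers.Dense(8 * 8 * 256, activation="relu")(z)
    x = layers.Reshape((8, 8, 256))(x)
    x = layers.Conv2DTranspose(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(z, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_a")  # learns to draw identity A (the celebrity)
decoder_b = build_decoder("decoder_b")  # learns to draw identity B (the source footage)

inp = layers.Input(shape=IMG_SHAPE)
autoencoder_a = Model(inp, decoder_a(encoder(inp)))
autoencoder_b = Model(inp, decoder_b(encoder(inp)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# Placeholder data; in practice these would be thousands of aligned face crops
# pulled from image search, stock photos, and YouTube frames, as Cole describes.
faces_a = np.random.rand(32, *IMG_SHAPE).astype("float32")
faces_b = np.random.rand(32, *IMG_SHAPE).astype("float32")

# Alternate training so the shared encoder learns pose and expression features
# common to both faces, while each decoder specializes in one identity.
for _ in range(5):
    autoencoder_a.fit(faces_a, faces_a, epochs=1, verbose=0)
    autoencoder_b.fit(faces_b, faces_b, epochs=1, verbose=0)

# The swap: encode frames of identity B, then decode with identity A's decoder,
# so A's face appears with B's pose and expression, frame by frame.
swapped = decoder_a.predict(encoder.predict(faces_b))
```

The trick is in the last line: frames of one person are encoded and then decoded with the other person's decoder, so the second face is redrawn wearing the first person's pose and expression before being pasted back into the video.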
There had been earlier, less-polished attempts at realistic video manipulation, including a video/audio mashup of former President Barack Obama that was created last summer at the University of Washington.
“All of this should freak you out,” Damon Beres and Marcus Gilmer wrote in a February 2018 Mashable story:
There’s not much legal recourse for those who fall victim to this new technology, according to Jonathan Masur, a professor who specializes in patent and technology law at the University of Chicago Law School. That's true even for private citizens.
“There’s the copyright claim, if you took the [footage] yourself. There’s the defamation claim if someone tries to say that it’s actually you. And if you’re a celebrity, there’s a right to publicity claim if someone is trying to make money off of it,” Masur explained. “But each of those is just a narrow slice of what’s going on here that won’t cover the vast majority of situations.”
What happens in porn doesn’t stay in porn: Deepfake videos and images have proliferated – and they’re improving, reports Gizmodo. One of the most recent innovations, Deep Video Portraits, uses a “source actor” to create input videos that in turn are used to “reanimate” a portrait of a target actor.
Deepfake would appear to owe some of its etymology to “Deep Throat” – both the 1972 porn film and the pseudonym of the secretive Watergate source later revealed to be FBI Associate Director Mark Felt. Other currently voguish deep compounds include deep state (whose American meaning was defined in 2014 by a former Republican congressman, and which has been the object of a Trump Administration conspiracy theory) and deep cut (a song judged to be less commercial than others on an album).
For more on fake, see my November 2016 word-of-the-week post. Related: uncanny valley.