Instead of focusing on the election chaos the technology could cause, politicians should also consider the way it's already doing harm
by Morgan Meaker / November 12, 2019
This morning a video of Boris Johnson was uploaded to Twitter. The leader of the Conservatives leaned earnestly into the camera as he endorsed "his worthy opponent," opposition Labour leader Jeremy Corbyn. "Only he, not I, can make Britain great again," said Johnson, before explaining why his voice sounded slightly disembodied. "I am a fake, a deepfake to be precise."
The stunt by think tank Future Advocacy—they released a video of Corbyn, too—is designed to draw attention to an unregulated world of synthetic video ready to manipulate our elections.
While the technology could manipulate elections, however, it hasn't yet. Video trickery in politics has so far avoided outright fabrication. The Conservatives' edited Keir Starmer interview and the video of American Democrat Nancy Pelosi, in which her speech had been slowed to make her sound drunk, are both examples of slippery editing, or "shallow fakes," but they are not deepfakes.
“My general view is that we still have a way to go before deepfakes become a real problem in elections, and at the moment we should be more concerned about ‘shallow-fakes’ and the many—many—other problems that the digital environment poses to our existing electoral systems,” Martin Moore, author of Democracy Hacked: Political Turmoil and Information Warfare in the Digital Age, told Prospect.
That reality has not stopped artists, AI companies and researchers from producing their own deepfakes to warn of what could be coming. But focusing entirely on the electoral disruption deepfakes could cause distracts from an issue that has already arrived. The technology is not yet a problem in politics, but it is somewhere else: pornography.
When DeepTrace Lab, a cyber-security start-up based in Amsterdam, embarked on its yearly study of deepfake videos online, it found the number had almost doubled since last year, reaching 14,678. Of those videos, 96 per cent featured pornography, all of it targeting women.
Danielle Citron, a law professor at Boston University, calls deepfake pornography made without consent "an invasion of sexual privacy." In a WIRED article, Citron said this was not new; rather, it points to how misogyny evolves with technology. "At each stage we've seen that people use what's ready and at hand to torment women. Deepfakes are an illustration of that."
Right now, 99 per cent of that pornography features women working in entertainment; British…