Last month, we predicted it was only a matter of time before deepfakes had their day in court. That day was Monday, when a Virginia law governing the sharing of deepfake pornography went into effect.
Pop quiz: What’s a deepfake? Good question. A deepfake uses machine learning and 3D models of a target’s face to manipulate a person in a video, editing and changing what they appear to say. Think of a deepfake as a digital puppet: the alterations play with a seamless audio-visual flow, with no jump cuts to give them away.
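For the technically curious, here’s a minimal sketch of the shared-encoder, twin-decoder trick behind many face-swap deepfakes. Everything in it (the network sizes, the random stand-in data, the single training step) is an illustrative assumption, not any particular tool’s implementation:

```python
# Sketch of the shared-encoder / twin-decoder idea behind many face-swap
# deepfakes. Real tools add face detection, alignment, GAN losses, and far
# larger networks; this is the bare concept.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face crop from the shared latent space."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One encoder is trained on faces of BOTH people; each person gets a decoder.
encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A
decoder_b = Decoder()  # learns to reconstruct person B

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.MSELoss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real face crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # stand-in for real face crops of person B

for _ in range(1):  # real training runs for many thousands of steps
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The swap: encode frames of person B, decode with A's decoder. The result is
# A's face wearing B's pose and expression -- the "digital puppet." Pasting
# these crops back into the video frames completes the fake.
fake_a = decoder_a(encoder(faces_b))
```

The swap is the whole trick: because a single encoder learns a face representation shared by both people, decoding one person’s frames with the other person’s decoder transfers the face while keeping the expression.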
See any dangers there?
We already reported on how deepfakes could play a crucial role in the 2020 election, especially if used for political evil. From a political standpoint alone, people are freaking out about the potential of deepfakes to steer an election, particularly if controversial words are put into a candidate’s mouth, passed off as real, and the video then goes viral within minutes.
But there’s also the matter of deepfake pornography, which is why the gavel came down in Virginia this week.
Yes, the Old Dominion has taken the legal lead, becoming the first state to outlaw the sharing of this type of computer-generated pornography; it’s still the Wild West when it comes to this new manipulation technology.
Take note: since 2014, Virginia has had a revenge porn law on the books, which makes it illegal to share nude photos of someone without their permission. Those found guilty of this misdemeanor face up to a year in jail and a $2,500 fine.
Makes perfect sense, right?
The newly passed Virginia law extends that protection to fabricated images and videos, whether manipulated in Photoshop or generated as deepfakes, which are getting harder and harder to spot as fakes.
Need an example? Deepfakes have become an entire porn category, with videos in which famous celebrities’ faces are mapped onto porn stars’ bodies to make it appear as if the A-lister is having sex on camera when they never did.
For very little money, anyone can create a deepfake on their home computer. Before it was taken down last week, the DeepNude app allowed anyone to create a rendered nude image of a clothed woman from an existing photo for $50. (Cracked versions of the app can still be found in online forums.)
So just imagine what twisted video concoction could be manufactured by an angry, jilted ex-boyfriend (or ex-girlfriend) who uses deepfake technology for revenge porn. And imagine the impact on the target’s life if that video were sent to their workplace or a family member. Yikes.
This potential for abuse has gotten the attention of other lawmakers. Last month, Rep. Yvette Clarke (D-N.Y.) introduced the Deepfakes Accountability Act in Congress, and while it might be a little far-reaching, the bill would criminalize manipulated media that is broadcast or shared as the real thing. Creators of deepfakes would have to disclose that the original video was altered, using “irremovable digital watermarks, as well as textual descriptions.” Otherwise, sharing the deepfake would be a crime.
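To see why “irremovable” is a high technical bar, consider a toy watermark. The sketch below hides a hypothetical disclosure string (the text and the approach are our illustration, not anything specified in the bill) in an image’s least-significant pixel bits. Real forensic watermarks are far more sophisticated, but even this shows the arms race: a single recompression wipes the marker out.

```python
# Toy "digital watermark": hide a disclosure string in the lowest bit of each
# pixel channel. Fragile by design -- re-encoding the image destroys it.
import numpy as np
from PIL import Image

DISCLOSURE = "ALTERED-MEDIA"  # hypothetical disclosure text, not the bill's wording

def embed(img: Image.Image, message: str) -> Image.Image:
    """Hide `message` in the least-significant bit of each pixel channel."""
    arr = np.array(img.convert("RGB"))
    flat = arr.flatten()
    bits = "".join(f"{byte:08b}" for byte in message.encode())
    if len(bits) > flat.size:
        raise ValueError("image too small for message")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | int(bit)  # overwrite the lowest bit
    return Image.fromarray(flat.reshape(arr.shape))

def extract(img: Image.Image, length: int) -> str:
    """Read back `length` bytes from the lowest bits."""
    flat = np.array(img.convert("RGB")).flatten()
    bits = "".join(str(p & 1) for p in flat[: length * 8])
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode()

marked = embed(Image.new("RGB", (64, 64), "gray"), DISCLOSURE)
assert extract(marked, len(DISCLOSURE)) == DISCLOSURE
# Saving `marked` as a JPEG would scramble these bits -- hence the difficulty
# of making any watermark truly "irremovable."
```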
The bill, though, was opposed by the Motion Picture Association of America (MPAA), which foresees a future of biopics featuring digital replicas of dead celebrities and felt the bill’s wording was too vague.
Maybe just don’t have the digitally resurrected celebrities in these biopics do any porn scenes… and all should be fine?