Just a few months ago, everyone was freaking out about deepfakes. To refresh: a deepfake is a sort of digital puppet, built with machine learning and 3D models of a target's face, that lets you manipulate what a person appears to say or do in a video.
Numerous apps are now available that allow anyone (yes, a shout-out to you, bad actors) to create a deepfake. The consensus among cybersecurity experts is that at some point during the 2020 election, deepfakes will play some sort of role. Blah.
Sure, we see the urgency to detect deepfakes in regard to a political election that will determine the most powerful person in the world. But what are other companies doing to counter deepfakes—especially companies for which undetected deepfakes could take down their entire business model?
Take the dare app Eristica. The app allows people to put up cash, create challenges and record said challenges. Those who win challenges get paid in cryptocurrency. The premise is pretty straightforward—somewhat similar to another site from a few years back called Record Setter. (Here I am setting a world record for Most Eyebrow Raises While Listening to a Recording of Jeff Daniels Reading the Gettysburg Address.)
Except now—real money is involved.
“I put up a $100 and say, ‘Hey, I bet you can’t do a kick-flip on a snowboard,'” Nikita Akimov, CEO and founder of Eristica, explained to Observer. “If that person submits a video, and the community votes that person up… then that person wins the money.”
The challenges on Eristica extend beyond kick-flips into the terrain of drinking an egg through your nose, running on a path of fire, eating a maggot-filled omelet and having a person shoot fireworks at his ass. (Holy shades of legal clearances!)
OK, so you can clearly see how deepfake videos could bring down Eristica’s entire credibility and business plan—especially since people are putting up real money for these challenges.
So, how does Eristica combat deepfakes?
“Detecting deepfakes is kind of easy, because on blockchain, you can see everyone’s transactions,” explained Akimov.
Via those transactions, the company can detect things such as betting irregularities (a person who has been doing $3 challenges suddenly betting $10,000) or a high-stakes challenge created outside a user's regular pattern.
“Our algorithm crawls our profiles and tries to find those drastic changes,” Akimov explained, stating that this is the first red flag that a video might be a deepfake, “[such as] if you’re betting on something, and then something drastically changes in your behavior.”
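The idea of flagging "drastic changes" in betting behavior can be sketched as a simple outlier check. This is a hypothetical illustration, not Eristica's actual algorithm: it flags any new bet whose size sits far outside a user's historical pattern, using an assumed z-score threshold.

```python
from statistics import mean, stdev

def flag_drastic_bet(history, new_bet, z_threshold=3.0):
    """Flag a bet that departs drastically from a user's history.
    `history` is a list of past bet amounts; the threshold of 3
    standard deviations is an assumption for illustration."""
    if len(history) < 2:
        return False  # not enough data to know "typical" behavior
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_bet != mu
    return abs(new_bet - mu) / sigma > z_threshold

# A user who has been doing $3 challenges suddenly bets $10,000:
history = [3, 2, 5, 3, 4, 3]
print(flag_drastic_bet(history, 10_000))  # True — flagged
print(flag_drastic_bet(history, 4))       # False — within pattern
```

A real system would weigh more signals than bet size (timing, counterparties, challenge type), but the shape of the check is the same: model the user's baseline, then flag large deviations as a first red flag for review.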
Yes, that's only the first step in the company's deepfake countermeasures. Eristica also uses the same algorithm as Instagram and YouTube to detect copyrighted content in both the video and audio files. Further, it uses a third-party search algorithm, which cuts the video file into frames and tries to locate the original source file using image search.
Apart from that, every time a user engages with the Eristica community, all of their activities and counterparties go through manual moderation to verify the legitimacy of the user and the content.
“Therefore, it’s extremely unlikely that a deepfake video would make it past all of these screening processes on the Eristica app,” Akimov said.
OK, this method might be great for detecting a fraudulent video of someone eating a maggot-filled omelet, but can the methodology be used during the 2020 campaign, in which the bar for the legitimacy of a video is much higher?
“AI (artificial intelligence) and behavior analysis has come a long way, it’s something that is used widely these days,” Akimov continued. “Politics might be a tough one since ‘typical behavior’ is not something that is present usually—at least these days.”
“Banking/financial industries or marketing industries are a much better fit for this,” he added.
So rest assured, America; we might not immediately be able to detect a deepfake of a 2020 candidate, but our maggot-filled omelet challenges will remain legit.