As we creep closer to a dystopian police surveillance state, it feels like our every move, action and nuance is now being monitored. It could be the digital footprint we leave via our smartphone or law enforcement using live facial recognition technology.
Fun Orwellian Stat: an estimated one billion surveillance cameras will be in operation worldwide by 2021. That’s a hell of a lot of surveillance.
Meanwhile, the city of San Francisco has banned the use of facial recognition technology by the police. As astutely pointed out by the ACLU, “People should be free to walk down the street without being watched by the government.” At least for now, San Franciscans won’t have to become characters in a real-life Black Mirror episode.
And worse, there’s a lot of evil profiling using facial recognition—I’m talking to you, Amazon Rekognition, which mistook Boston athletes of color for criminals.
Still, artificial intelligence (AI) and facial recognition always seem to get lumped together in the same Orwellian package. Austin startup Athena Security’s mission is to make existing dumb cameras smart, basically taking the bias and nefarious profiling out of the equation with AI security cameras that look only for guns and weapons.
“Our gun detection system connects directly to your current security camera system to deliver fast, accurate threat detection—including guns, knives and aggressive action,” the company states on its website. “Our system can also alert you to falls, accidents and unwelcome visitors.”
“We use AI—but a form of computer vision and object detection that’s 99+% accurate in spotting guns, weapons and criminal acts,” Lisa Falzone, CEO of Athena Security, told Observer. “We don’t care about personal information because our mission is to instantly spot a gun so police, medical and on-site staff can act to try and mitigate a shooting and potentially save lives.”
Falzone stated that the technology doesn’t profile; it works more like a smoke detector, except that instead of sensing fire, it alerts the security system when a gun comes into view.
“Since we don’t care about personal info or PII (personally identifiable information) and we don’t use facial recognition, there is no bias present,” said Falzone. “It’s all about accuracy and object detection and alerting.”
Blurring faces and other personal identifiers is another straightforward step Athena takes to keep things ethical and to eliminate racial bias.
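Athena hasn’t published how that blurring actually works, but the general idea is easy to sketch. Here’s a minimal, hypothetical Python example using OpenCV’s stock face detector, purely as an illustration of redacting faces on-device before footage is stored or reviewed (none of this is Athena’s actual code):

```python
# Generic illustration of on-device face blurring (not Athena's pipeline):
# detect faces in a frame and Gaussian-blur them before anything leaves the box.
import cv2

# OpenCV ships a stock frontal-face Haar cascade; it stands in here for
# whatever detector a real vendor would use.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Return a copy of the frame with every detected face heavily blurred."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    redacted = frame.copy()
    for (x, y, w, h) in faces:
        roi = redacted[y:y + h, x:x + w]
        # A large blur kernel makes the face unrecognizable while leaving the scene intact.
        redacted[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return redacted
```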
“We also focus on keeping data on the premises and not using the cloud; we extend our focus on objects as far as possible,” explained Falzone. And, unlike Facebook, “our mission is to save lives, not make a buck on third-party initiatives or selling people’s data or info.”
“Using mass shootings and horrific terrorist attacks as our north star, we focus on mitigating and preventing human loss,” she added.
The nuts and bolts behind this AI gun-detection tech: it plugs into existing security camera systems. It’s essentially a black box containing the “Computer Vision” brain, which sends instant alerts when a weapon is detected. The alerts are tracked via an app that shows the user the actual video, so they can understand exactly what is going on and act accordingly.
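The company hasn’t disclosed the internals of that black box, but the description maps onto a familiar computer-vision pattern. A rough, hypothetical Python sketch of such a pipeline, where the camera URL, the alert webhook and the detect_weapons() stub are all placeholders rather than Athena’s actual interfaces, might look like this:

```python
# Hypothetical sketch of the plug-in pipeline described above: pull frames from an
# existing camera's RTSP feed, run a weapon detector, and push an instant alert
# (with a snapshot for human review) to a monitoring app.
import time
import cv2
import requests

CAMERA_URL = "rtsp://192.168.1.20/stream1"        # existing on-site camera (placeholder)
ALERT_WEBHOOK = "https://alerts.example.com/gun"  # monitoring-app endpoint (placeholder)

def detect_weapons(frame):
    """Placeholder for a trained object-detection model; returns a list of labels."""
    return []  # e.g. ["handgun"] when a real model fires

def monitor():
    cap = cv2.VideoCapture(CAMERA_URL)
    while True:
        ok, frame = cap.read()
        if not ok:
            time.sleep(1)
            continue
        detections = detect_weapons(frame)
        if detections:
            # Attach the frame so a human can see exactly what is going on before acting.
            _, jpeg = cv2.imencode(".jpg", frame)
            requests.post(
                ALERT_WEBHOOK,
                data={"labels": ",".join(detections), "ts": time.time()},
                files={"snapshot": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
            )

if __name__ == "__main__":
    monitor()
```

In a real deployment, the detector would be the trained model running on the on-premises box, and the alert would presumably carry a short video clip rather than a single frame.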
“The ‘Computer Vision’ brain was trained by professional actors, acting out scenes—crucial in overcoming the false positives that plague other vendors,” said Falzone. “Our system also continues to learn on the job and gets better with age, constantly building up hours and hours of on-the-job accuracy seeing objects and determining what is a gun, weapon or criminal act.”
Still, we’ve heard so many horror stories of police shooting a suspect because they thought a cell phone in the suspect’s hand was a gun. How does Athena’s tech prevent these horrific scenarios from happening?
“We’re 99+% accurate, able to recognize and differentiate a cell phone from a gun. The system’s constantly taking in more and more feedback, learning about every possible shape and configuration of objects so that mistakes don’t happen,” said Falzone. “There is a reality check by the user: the video can be seen and verified to further eliminate the chance that someone is misjudged as wielding a gun, and we can also put a human layer of intelligence in if the client desires.”
So this is where we’re at with our security tech in the United States of America, where mass shootings have sadly become a common stitch in the fabric of daily life.
“While gun control legislation to solve mass shootings and terrorism continues to stall, Athena’s approach is a non-political solution to help save lives: true, effective change happening now,” Falzone concluded.
Obviously, we applaud tech that can prevent yet another shooting tragedy and doesn’t factor racial bias into the equation. Still, there are many states where it’s now legal to openly carry guns. And we’re always hearing stories of NRA supporters walking into restaurants and public spaces with, say, a visible firearm strapped to their back, because it’s perfectly legal in those locales.
In the 31 states that allow open carry, this must set the Athena security system ringing off the hook, like a noisy carnival midway.
Maybe it’s time to create an AI device that can detect when a good guy with a gun decides to become a bad guy with a gun?