Last month at the Blue Grass Army Depot, defense officials saw a demonstration of smart-camera technology meant to better detect active shooters.
The depot is the first Army base to test SCYLLA, a tool that uses machine learning to immediately detect weapons, faces and “abnormal behavior” like violence.
Chris Willoughby is the depot’s electronic security systems director. He says the software can be installed on security cameras and drones and analyzes footage in real time.
“With that, we can build into systems to lock doors, send dispatch reports, send information and screen snippets to police to make a holistic solution,” Willoughby said.
The tool is available commercially, but demonstrations at the Blue Grass Army Depot mark the first time it’s being tested for military use.
Willoughby says it’s meant to allow a much quicker response to potential threats.
“You imagine picking up a gun 200 feet away off of a school, and automatically sending an alert to the cops immediately, and autonomously locking the doors all at the same time and letting everybody know what's going on,” Willoughby said.
The tech is branded as “artificial intelligence,” but Willoughby says it’s different from generative AI programs like ChatGPT.
“It basically looks at pixels on the screen. When they move, the pixels change, and it sends you an alert. So it's not truly artificial intelligence, it's video based analytics,” Willoughby said.
Local leaders, like Berea Independent School District Superintendent Diane Hatchett, visited the demonstration to see if the technology could potentially be used in local school systems.
“Kids are our most precious resource, so that's why I'm here,” Hatchett said. “Anything I can learn that can help protect our students is worth my time. That's why I'm here. I want to learn as much as I can.”
But it also brought national officials to the depot to see it in action. It was part of a national Department of Defense research and development initiative to bolster security on military installations.
Department of Defense official Drew Walter says the department would like to see the technology adopted widely if it proves successful.
“We are hopeful that this, among other technologies, could be installed across the board, whether that’s at the Pentagon or at an Army base or an Air Force installation, whether within the United States or overseas,” Walter said.
But the tech has seen pushback from organizations like the American Civil Liberties Union.
Opponents like Jay Stanley, a senior policy analyst at the ACLU, have voiced concerns about accuracy and about trading privacy for security.
“We know that AI is very inaccurate and quirky, and so one set of questions arises around where, if a person is holding a cell phone or some other object – maybe a drill or what have you – and the AI thinks it’s a gun, how is that handled? Is there bias?” Stanley said.
But defense officials like Walter say they have full faith in the program.
“We've been funding this program for a few years now,” Walter said. “We're really excited to see the progress, and we're hopeful that this, combined with many other technologies, can really get after those threats and prevent lives lost, prevent damage to our facilities and high value assets.”
The DoD’s goal is for the technology to detect hazards 96 percent of the time, with as few false alarms as possible.
The department will continue to test and evaluate the system before making a final decision on whether the smart-camera tech is worth implementing.