In an effort to prevent violence on campus, Oak Lawn High School in Chicago’s south suburbs recently installed artificial intelligence software designed to detect guns both inside and outside the high school, the Chicago Tribune reports.

Superintendent Shahe Bagdasarian said it was an easy decision to apply for a grant through Omnilert, a company that aims to prevent school shootings and violence through innovative technologies.
Oak Lawn’s move reflects a broader shift: school districts are increasingly turning to AI-enhanced surveillance to detect and deter threats. Driven by technological innovation and state-level funding opportunities, AI gun detection is no longer a niche solution; it is entering the mainstream of school safety strategy. But reliance on the technology raises questions about accuracy, privacy, and the allocation of resources, and debate is intensifying as the systems spread.
“We need money to try these things,” the Tribune quoted Bagdasarian as saying.
Over the past few years, several states have approved investments in AI gun detection. Kansas proposed $5 million in grants with deployment and technical criteria so stringent that ZeroEyes was effectively the only vendor that qualified. Florida, Missouri, Michigan, and others are pursuing similar plans, signaling a rapid shift toward AI-based weapons detection in schools, according to the Associated Press.
Other local school districts across the US are exploring or launching these systems as well. Putnam City Schools in Oklahoma, for example, recently became the first large district in the state to adopt ZeroEyes, showing that AI-based gun detection is expanding beyond urban centers, according to a press release from the district.
Critics of AI-based gun detection systems in schools argue that these tools create more problems than they solve. They warn that adding AI to surveillance cameras risks normalizing constant monitoring of students, which can erode privacy and create a prison-like atmosphere in schools. Concerns also extend to the accuracy of these systems: false positives, where everyday objects like umbrellas or cell phones are mistaken for guns, could trigger unnecessary lockdowns or even harmful encounters with law enforcement. At the same time, these systems aren’t foolproof. High-profile cases have shown that guns can still go undetected if not clearly visible on camera, raising doubts about whether schools are truly safer with them in place.
Beyond technical reliability, many critics focus on priorities and equity. They point out that such technologies are costly and often divert limited school funding away from mental health services, counseling, and community-building initiatives that may address root causes of violence more effectively. Civil rights advocates further warn that surveillance systems can disproportionately harm marginalized students, who are more likely to be misidentified or face increased disciplinary action. In short, opponents caution that while AI gun detection may promise faster responses, it risks offering a false sense of security while ignoring deeper, more human-centered approaches to school safety.