- An AI-powered weapons scanner meant to create “weapons-free zones” fails to detect knives.
- A New York school district bought the nearly $4 million system and then found out it didn’t fully work.
- A 17-year-old attacker managed to walk through the scanner last year with a nine-inch knife and stab another student.
There is both great fear and great excitement over the future of AI.
There are also lingering questions about what it can actually do.
One New York school district learned this the hard way. It spent close to $4 million to buy an AI-powered weapons scanner from Evolv Technology that the company bills as “proven artificial intelligence” able to create a “weapons-free zone.”
Then, on Halloween last year, a student walked through the scanner carrying a nine-inch knife and used it to stab a fellow student multiple times, according to the BBC.
Teachers broke up the altercation, local news station WKTV reported, and the victim was taken to the hospital. Police at the time said the student’s injuries were not life-threatening.
“When we viewed the horrific video, we all asked the same question. How did the student get the knife into the school?” Brian Nolan, the superintendent of Utica Schools, told the BBC.
The answer appears to be that artificial intelligence is not yet foolproof.
A BBC investigation found that, while Evolv Technology claims its systems can detect guns, knives, and explosive devices, a scanner missed large knives 42% of the time across 24 walk-throughs.
Evolv Technology claims on its website that its weapons detection system can scan for weapons 10 times faster than traditional metal detectors. The company did not respond to Insider’s request for comment at the time of publication.
Co-founder Anil Chitkara told WRAL, a news station in North Carolina, that the “AI algorithm is trained on thousands and thousands of different items, different weapons, different guns and also, different personal items, phones, keys, and other things.”
Chitkara explained that when someone walks through the scanner, the system sends an alert if it spots something suspicious. It takes a photo and places a red box over the area where it detected the suspicious item. An officer can then check whether a weapon is actually present.
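That kind of alert flow can be sketched conceptually in a few lines of code. The sketch below is purely illustrative and is not Evolv’s actual software: the detect_items function, the labels, and the confidence threshold are all assumptions standing in for a trained detection model.

```python
# Illustrative sketch only -- not Evolv's actual code. It assumes a
# hypothetical detect_items() model call that returns bounding boxes
# with labels and confidence scores for each walk-through.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "knife", "phone", "keys"
    confidence: float   # 0.0 - 1.0, assigned by the (hypothetical) model
    box: tuple          # (x, y, width, height) in image pixels

WEAPON_LABELS = {"gun", "knife", "explosive"}
ALERT_THRESHOLD = 0.6   # assumed cutoff; a real system would tune this

def detect_items(frame) -> list[Detection]:
    """Stand-in for a trained object-detection model."""
    # A real system would run the scanner's sensor data through its model here.
    return []

def screen_walkthrough(frame) -> list[Detection]:
    """Return suspicious detections to flag for an officer's review."""
    flagged = []
    for det in detect_items(frame):
        if det.label in WEAPON_LABELS and det.confidence >= ALERT_THRESHOLD:
            # A real system would overlay a red box at det.box on the
            # captured photo and push an alert to security staff.
            flagged.append(det)
    return flagged
```

If a sketch like this resembles the real pipeline, the BBC’s findings point to the detection step itself: when the model never scores a knife highly enough, no box is drawn and no officer is alerted.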
Despite its failure to accurately identify weapons like long knives, Evolv Technology’s scanner is being used in hundreds of schools, according to its website. The technology is also used in stadiums and theme parks like Six Flags.