AI-Powered Cameras for School Safety

Retailers can spot shoplifters in real time and alert security, or flag a potential shoplifter. One company, Athena-Security, has cameras that spot when someone has a weapon. And in a bid to help retailers, it recently expanded its capabilities to help identify big spenders when they visit a store.

It’s unknown how many schools have AI-equipped cameras because it’s not being tracked. But Michael Dorn, executive director of Safe Havens International, a nonprofit that advises schools on security, said “quite a few” use Avigilon and Sweden-based Axis Communications equipment “and the feedback has been very good.”

Schools are the largest market for video-surveillance systems in the U.S., estimated at $450 million in 2018, according to London-based IHS Markit, a data and information services company. The overall worldwide market for real-time video analytics was estimated at $3.2 billion in 2018 and is anticipated to grow to more than $9 billion by 2023, according to one estimate.

AI cameras have already been tested by some companies to evaluate consumers’ facial expressions to determine if they’re having a pleasant or unpleasant shopping experience and improve customer service, according to the Center for Democracy and Technology, a Washington nonprofit that advocates for privacy protections. Policy counsel Joseph Jerome said companies may someday use the cameras to estimate someone’s age, which might be useful for liquor stores, or facial-expression analysis to aid in job interviews.

Police in New York, New Orleans and Atlanta all use cameras with AI. In Hartford, Connecticut, the police network of 500 cameras includes some AI-equipped units that can, for example, search hours of video to find people wearing certain clothes, or search for places where a suspicious vehicle was seen.

The power of the systems has sparked privacy concerns.

“The issue is personal autonomy and whether you’ll be able to go around walking in the public square or a shopping mall without tens, hundreds, thousands of people, companies and entities learning things about you,” Jerome said.

“People haven’t really caught up to how broad and deep the technology can now go,” said Jay Stanley, a senior policy analyst at the American Civil Liberties Union who published a research paper in June about how the cameras are being used. “When I explain it, people are pretty amazed and spooked.”

When it comes to the potential for stemming violence, that may be less of an issue. Shannon Flounnory, executive director for safety and security for the Fulton County School District, said the district has heard no privacy concerns.

“The events of Parkland kind of changed the game,” he said. “We have not had any arguments or any pushback right now.”

ZeroEyes, a Philadelphia-based company, is testing gun-detection software at Rancocas Valley Regional High School in New Jersey but is not yet selling its product. When it does, it will also market to “stadiums, shopping malls — anywhere with a potential for a mass shooting,” said Rob Huberty, company co-founder.

Even supporters of these systems acknowledge the technology is not going to prevent all mass shootings — especially considering how quickly damage is done. But supporters argue they can at least help reduce the number of casualties by giving people more time to seek shelter and providing first responders with information sooner.

“This is just one thing that’s going to help everybody do their job better,” Huberty said.

Both ZeroEyes and Austin-based Athena-Security claim their systems can detect weapons with more than 90 percent accuracy but acknowledge their products haven’t been tested in a real-life scenario. Neither system can detect a weapon that is concealed, a limitation the companies say they are working to overcome.

Stanley, with the ACLU, said there’s reason to be skeptical about their capabilities because AI is still “pretty unreliable at recognizing the complexities of human life.”

Facial recognition is not infallible, and a study last year from Wake Forest University found that some facial-recognition software interprets black faces as appearing angrier than white faces.

But the seemingly endless cycle of mass shootings is compelling consumers to see technology — untested though it may be — as a possible solution to an intractable problem.

After a gunman killed 51 people in attacks at two mosques in New Zealand in March, Athena-Security installed gun-detection cameras at one of the mosques in June. Fahad A.B. Al-Ameri, a Qatari businessman with no affiliation to the mosque, paid for them because “all people should be secure going to their houses of worship,” he said.

Of the 50 clients Athena-Security has, about a fourth are schools, said company co-founder Chris Ciabarra.

“It’s a matter of saving lives,” he said.
