Despite the outcry over privacy, New York school district starts testing facial recognition system on public school students, staff

by: Lance D Johnson
NaturalNews.com
Monday, December 02, 2019

The Lockport City School District in upstate New York has spent approximately $4.2 million on the installation of 417 cameras in its high school, middle school and six elementary schools. The new surveillance system will use facial recognition software to monitor students and staff on a day-to-day basis. The school district’s goal is to alert law enforcement when sex offenders and other criminals enter the premises. The cameras will also detect whether illicit drugs, firearms or other weapons are present on campus. Officials hope to mitigate crime and stop potential shooters, but the technology presents several privacy challenges.

School district to monitor students and target subjects using a database and intrusive facial recognition system

Lockport City is the first school district to roll out such an invasive system. The district is using Aegis camera systems, which were installed back in October 2018. The district's automated facial recognition software was activated on June 3, 2019, and became operational across all of its schools in September. The sophisticated fleet of cameras is now scanning student and faculty faces in real time, tracking their movements, reading their expressions, and analyzing their motivations.

District officials can enter any video footage or personally identifiable information into the database. The cameras then cross-reference people's faces in real time with the information that has been uploaded into the database. The system was developed by SN Technologies Corp. and uses proprietary software that can also identify several types of firearms in a crowded room.

Despite its potential to thwart violence and illicit activity, parents, privacy advocates, and some legislators are pushing back against the invasive facial recognition system. Privacy experts warn that the technology can be misused and that it represents a gross violation of individual privacy rights. For one, the system will allow officials to target select students and members of the community. Power-hungry superintendents and principals will be able to treat select students as subjects and track their whereabouts. School officials will be able to enter any source material into the database, including images of students they have expelled, people in the community they do not like, or mug shots from the local jail. When the fleet of cameras scans faces in real time, each one is cross-referenced with the photos in the database. When the software finds a match, a warning is sent to school authorities.
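To make that workflow concrete, here is a minimal, hypothetical sketch of a watchlist-style cross-reference. It uses the open-source face_recognition Python package rather than SN Technologies' proprietary software, and the file paths, names, and matching tolerance are made up for illustration.

```python
# Hypothetical watchlist check, sketched with the open-source
# face_recognition package -- NOT SN Technologies' actual software.
import face_recognition

# "Database" of faces uploaded by officials (expelled students, barred
# individuals, mug shots, etc.). Paths and names are placeholders.
watchlist_images = {
    "barred_person_1": "watchlist/barred_person_1.jpg",
    "barred_person_2": "watchlist/barred_person_2.jpg",
}

# Enrollment: compute one 128-dimensional face encoding per watchlist photo.
watchlist_encodings = {}
for name, path in watchlist_images.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        watchlist_encodings[name] = encodings[0]

def check_frame(frame_path, tolerance=0.6):
    """Compare every face found in a camera frame against the watchlist."""
    frame = face_recognition.load_image_file(frame_path)
    alerts = []
    for face in face_recognition.face_encodings(frame):
        names = list(watchlist_encodings)
        known = [watchlist_encodings[n] for n in names]
        matches = face_recognition.compare_faces(known, face, tolerance=tolerance)
        for name, is_match in zip(names, matches):
            if is_match:
                alerts.append(name)  # in a real system: send a warning to authorities
    return alerts

print(check_frame("camera/frame_001.jpg"))
```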

School officials will also be able to work with local law enforcement when the cameras detect an individual who is barred from campus. Those barred from campus may include former employees whom school officials did not like. School officials will be able to place suspicious students or potential gang members in the database, targeting individuals without any due process. Furthermore, the public will not be allowed to access the database to find out whether they are being monitored or singled out.

Facial recognition software generates a unique profile of the individual, precisely measuring the length of the jawline, the distance between the eyes, the depth of the eye sockets, the width of the nose, and more. More than eighty nodal points are mapped on a person's face. These measurements can help authorities detect the whereabouts of an individual in real time. The software compares these measurements to an ever-expanding database of images to find a match. Artificial intelligence uses the measurements to generate a black-and-white profile of each person's face so that the person can be spotted in a crowd and quickly matched against the database. Armed with this technology, authorities can monitor and track whomever they want.
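As a rough illustration of the geometric approach described above, the sketch below turns a handful of those measurements into a numeric "faceprint" and matches it against a small database by Euclidean distance. The specific measurements, values, and threshold are assumptions for illustration, not the district's actual algorithm.

```python
import numpy as np

# Hypothetical faceprint built from a few of the geometric measurements the
# article mentions (real systems map dozens of nodal points).
def faceprint(jawline_len, eye_distance, eye_socket_depth, nose_width):
    return np.array([jawline_len, eye_distance, eye_socket_depth, nose_width])

# Illustrative database of previously enrolled faceprints (made-up values, in mm).
database = {
    "person_a": faceprint(118.0, 62.5, 27.1, 34.8),
    "person_b": faceprint(124.3, 58.9, 25.4, 36.2),
}

def best_match(query, db, threshold=2.0):
    """Return the closest enrolled identity, or None if nothing is close enough."""
    name, dist = min(
        ((n, float(np.linalg.norm(query - fp))) for n, fp in db.items()),
        key=lambda item: item[1],
    )
    return (name, dist) if dist <= threshold else (None, dist)

query = faceprint(118.4, 62.1, 27.0, 35.0)  # measurements taken from a live camera frame
print(best_match(query, database))           # -> ('person_a', <small distance>)
```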

Facial recognition is not perfect either. In fact, when facial recognition is used by law enforcement, it is wrong about 0.8 percent of the time. For every 1,000 facial scans, an average of eight people will be falsely identified. An innocent person can easily be convicted of a crime.
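For context, the arithmetic behind that figure is simple: a 0.8 percent error rate applied to 1,000 scans works out to roughly eight misidentifications, as the quick calculation below shows (the rate and scan count are taken from the paragraph above).

```python
# Expected number of false identifications at the error rate cited above.
error_rate = 0.008   # 0.8 percent
scans = 1_000
print(error_rate * scans)   # -> 8.0 expected misidentifications per 1,000 scans
```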
