Invading student privacy will not keep schools safer from shootings | Lori Bezahler

This year, students in Florida headed back to school for reading, writing and a new Big Brother. The Florida Schools Safety Portal, a statewide database, will collect, sort and analyze sensitive data about students to share with law enforcement. Created in response to the tragedy at Marjory Stoneman Douglas high school in Parkland, the portal is described as an early warning system to identify and assess potential threats. But responding to legitimate concerns about school shootings with a system that invades student privacy and labels children as threats will not make schools safer.

The desire to find a foolproof way to protect children in schools is not new. Just months after the Columbine high school shooting in 1999, I sat in a conference room while security industry salespeople pitched an “early warning system” that would “scientifically” identify “at-risk students”, a predictive analytics system and precursor to the one being pushed in Florida.

Law enforcement already uses similar systems widely, despite considerable evidence that they generate biased results, often unfairly targeting people of color. Police forces use them to decide where to deploy officers and to build lists of potential suspects. Courts use them to determine prison sentences. Customs and Border Protection is installing facial recognition systems at airports and using them to scan state license databases.

All this is done in spite of obvious privacy and civil rights concerns – not to mention evidence that these systems aren’t even accurate. A researcher with the MIT Media Lab found racial and gender bias embedded in the very code that runs predictive data systems: software that failed to recognize the faces of dark-skinned users and registered the eyes of Asian people as closed when they were in fact open.

And now some school officials want to put this problematic technology in schools. Children as young as four could be forced to share personal and biometric data with law enforcement and potentially immigration services. In addition to invading the privacy of students, this technology criminalizes innocent adolescent behavior and life conditions such as homelessness.

It also facilitates the further targeting of black and brown students in systems that already penalize them at higher rates than their white peers. Several studies have shown that black and Latino children are routinely perceived as more dangerous than their white peers, regardless of their behavior. Black students are not more likely to misbehave than white students, yet they are more likely to be suspended, receive corporal punishment or have a school-related arrest. Increased police presence has not been shown to make schools safer, but it does put black and brown students at greater risk of suspension and expulsion. Students at schools with police are five times more likely to be arrested for disorderly conduct than students in schools without police. Predictive analytics and technological surveillance will no doubt exacerbate those disparities.

In Saint Paul, Minnesota, the Coalition to Stop the Cradle to Prison Algorithm, a group of parents, educators and civil rights advocacy organizations, successfully blocked a joint powers agreement that would have shared student data across educational and law enforcement systems, feeding that data into a mathematical algorithm to assess children’s likelihood of violence. San Francisco and Oakland, cities full of tech workers who understand these systems better than most, have banned facial recognition technology. And the New York state legislature is considering a bill to ban the use of this technology in schools after the 4,400-student Lockport school district spent $4m on a facial recognition system.

The security technology market is capitalizing on fears about school safety to sell unproven, costly surveillance systems that put students, particularly students of color, at risk. The implications of using an unregulated system of data collection, combined with biased and inaccurate surveillance technology, on schoolchildren are not only alarming but frankly dystopian.

Policing, surveillance and biased algorithms undermine a culture where students feel valued and respected. Schools are safer when they prioritize the relationships between students and teachers, provide counselors and engage in social and emotional supports.

Across the country communities are demanding a holistic approach to safety where students of color, LGBTQ and gender nonconforming students, immigrant students, and students with disabilities are protected. With chants of “counselors not cops”, they are calling for interventions that acknowledge the roots of violence and oppression, rather than criminalize them.

Policymakers and school officials concerned about the safety of children in schools must be vigilant in protecting them from unproven interventions that are unlikely to keep them safe.

  • Lori Bezahler is the CEO of the Edward W Hazen Foundation, a private foundation committed to supporting the organizing and leadership of young people and communities of color in dismantling structural inequity based on race and class
