In the classic sci-fi movie "Minority Report," Tom Cruise plays a cop whose "Precrime" unit uses surveillance and behavior patterns to arrest murderers before they kill. Set in the future, the movie raised tough questions about privacy, due process, and how predicting criminal behavior can destroy innocent lives.
But what once seemed like an action fantasy is now creeping into American classrooms.
Today, across the country, public schools are adopting artificial intelligence tools, including facial recognition cameras, vape detectors, and predictive analytics software, designed to flag students considered "high risk," all in the name of safety. But civil rights advocates warn that these technologies are being disproportionately deployed in Black and low-income schools, without public oversight or legal accountability.
A recent report from the Center for Law and Social Policy (CLASP) argues that AI programs and mass surveillance aren't making schools any safer; instead, they are quietly expanding the school-to-prison pipeline. And according to author Clarence Okoh, the tools don't just monitor students; they criminalize them.
"The most insidious aspect of youth surveillance in schools is how it deepens and expands the presence of law enforcement in ways that were previously impossible," says Okoh, a senior associate at the Georgetown Law Center on Privacy and Technology. "Black students are being watched before they even act."
Surveillance in the Name of School Safety?
The rise of school surveillance didn't begin with AI, but the advancing technology has taken it to a new scale. According to the National Center for Education Statistics, 91% of public schools use security cameras, while more than 80% monitor students' online activity. Yet there is little evidence that these tools improve safety, and even less to show they've been tested for bias.
In fact, a 2023 Journal of Criminal Justice study found that students in "high-surveillance" schools had lower math scores, fewer college admissions, and higher suspension rates, with Black students bearing the greatest impact. These systems include facial recognition, social media monitoring, location tracking, and vape and gun detection sensors.
"The line between school and jail is being erased, not metaphorically, but digitally," Okoh says.
In Pasco County, Florida, for example, an AI program secretly used school records to flag children for future criminal behavior based on grades, attendance, and discipline, leading to home visits, interrogations, and harassment.
"It wasn't hypothetical," Okoh says. "Kids were being watched, tracked, and punished, and families were being pushed out."
He adds that the incident in Pasco wasn't isolated: "These tools are being marketed across the country, and the schools most likely to say yes are the ones serving Black and low-income students."
Funded by Fear, Backed by Public Dollars
One of the report's most alarming revelations was that schools paid for much of this AI surveillance with federal money meant to support students during the COVID-19 pandemic. Okoh's report found that districts spent CARES Act and American Rescue Plan funds to purchase unvetted AI tools. Some vendors even advertised their products, including predictive policing tools and vape sensors, as eligible expenses under COVID-era guidelines.
And yet, according to Okoh, most districts fail to assess whether these tools meet federal civil rights obligations before implementation. He warns that this is more than an oversight: it's a violation of federal funding laws.
"Title VI, disability rights, and data privacy laws are supposed to apply to any district that receives federal funds," he says. "But there isn't a single school I know of that has actually tested these technologies for civil rights compliance before buying them."
The Cost of Being Watched
As school surveillance grows, students are increasingly aware they're being watched, and Okoh says it's affecting their well-being.
"Surveillance is easier than care," he recalls one youth saying in a recent focus group. "And that's the problem. These tools replace trusted relationships with punishment, and they do so under the guise of safety."
Experts say the threat of surveillance isolates and criminalizes students. The more schools invest in these technologies, the fewer dollars go toward counselors, therapists, and restorative justice programs, the very things known to improve student outcomes.
"There's this assumption that tech is neutral," Okoh says. "But it's not. These systems are built on data that already reflect bias, and then they turn that bias into decisions about who to monitor, who to discipline, and who to exclude."
Even when the technology fails, Okoh says, the damage is already done. "Young people are saying, 'If you truly cared about us, you wouldn't be watching us; you'd be investing in us.'"
A Youth-Led Vision of Safety
Okoh and the coalition he co-founded, NOTICE, or NoTech Criminalization in Education, are working to create safe school environments without relying on surveillance. Their vision centers on mental health supports, youth-led crisis response teams, peer mentorship, and restorative justice.
"Black students deserve schools that trust them, not track them," he says. "They're not asking for a world without accountability. They're asking for one where they're not criminalized just for being who they are."
He's also calling on policymakers to ban surveillance tech in schools outright, particularly tools that monitor biometric data, social media, or behavior without transparency or due process. And he urges the Black community to stay engaged and vigilant on the issue.
"These contracts pass quietly through school board meetings," Okoh adds. "We need parents, educators, and students to all show up for our kids and say, 'Our schools are not testing labs for private tech companies.'"