The ACLU of Massachusetts, the Massachusetts chapter of the American Federation of Teachers, and the Massachusetts Teachers Association today sent a letter to 273 school superintendents calling on them to reject biased, dangerous, and unregulated face surveillance technology. The letter asks leaders to keep this dystopian technology out of Massachusetts elementary, secondary, and vocational-technical schools.

“Face surveillance technology has no place in our classrooms, but companies are nonetheless pushing their often-untested systems on schools nationwide,” said Carol Rose, executive director of the ACLU of Massachusetts. “Schools are meant to be safe environments for students to learn, explore, and play. Face surveillance technology threatens that environment, subjecting children and school staff to invasive, unreliable spying and data collection. Massachusetts has the opportunity to demonstrate a real commitment to the well-being of students in the digital age by pledging to keep this technology out of our schools, and we look forward to working with districts statewide to that end.”

Emails obtained by the ACLU of Massachusetts indicate private technology companies consider public schools a potential market for their surveillance products. News reports reveal that technology from Clearview AI, a controversial facial recognition startup with a database of billions of photos scraped from social media and the web, has been tested or implemented in more than 50 educational institutions across 24 states, including Somerset Berkley Regional High School in Massachusetts.

But, according to advocates and educators, face surveillance technologies pose serious privacy and safety risks for students and education workers. Invasive, constant surveillance of children while they are growing up and developing their personalities ratchets up anxiety at a time when students need adults to create a trusting environment, one in which they feel calm, safe, and accepted.

“Our education dollars should be spent on direct services to our students such as creating small, safe, and nurturing learning environments,” said Beth Kontos, president of AFT Massachusetts. “We need more classroom supports, more counselors, and a nurse in every school. Facial recognition is a faulty and expensive technology that will make our most vulnerable community members feel unsafe at school, and runs the risk of exacerbating the school-to-prison pipeline and harming community trust.”

Face surveillance technology is particularly ill-suited for use on children because these systems are modeled on and optimized for adult faces. Research that tested five “top performing commercial-off-the-shelf” facial recognition systems shows that these systems “perform poorer on children than on adults.” A recent National Institute of Standards and Technology study likewise found that these systems are much more likely to fail when attempting to identify children, women, and the elderly.

“Teachers and students face real challenges, especially in communities of color and low-income communities,” said Merrie Najimy, president of the Massachusetts Teachers Association. “The very last thing we should do as educators is allow ourselves to fall victim to the false idea that dystopian surveillance technologies like facial recognition will strengthen our communities. Instead of adopting biased, privacy-invasive technology, school districts should spend these precious financial resources to bring on more staff and build programs that will help all students and families feel safe and included at school. Face surveillance technology directly undermines that trust.”

According to the letter, school use of face surveillance technology raises particularly serious concerns when it is used to track and control students in school districts serving immigrants, people of color, and low-income families, where the school-to-prison pipeline has too often diverted young people from their education and entangled them in the criminal legal system. Studies show face surveillance algorithms reinforce racial and gender biases, with inaccuracy rates of up to 35 percent for Black women, and generally perform worse when identifying people of color. In schools, false positives resulting from faulty technology may lead to unnecessary interactions with law enforcement, lost class time, disciplinary action, and potentially even a criminal record.

The ACLU of Massachusetts launched “Press Pause on Face Surveillance,” a campaign to build awareness about the civil liberties concerns posed by face surveillance technology and the need to pause government use of the technology. An ACLU-backed bill currently before Massachusetts legislators on Beacon Hill would establish a statewide moratorium on government use of face surveillance and other biometric screening technologies until the legislature imposes checks and balances to protect the public’s interest. Meanwhile, the ACLU is working in municipalities including Springfield, Cambridge, Northampton, Brookline, and Somerville to bring this technology under democratic control by introducing and enacting local prohibitions on government use.