Police are copying and pasting body parts in face recognition searches
In two reports published this week, Georgetown Law’s Center on Privacy & Technology joins the ACLU of Massachusetts in calling for a moratorium on the government’s use of face surveillance technology, citing alarming new findings about law enforcement’s use of the tool nationwide.
The reports document disturbing and sometimes bizarre law enforcement uses of the unregulated, often inaccurate technology. In many cases, the Georgetown scholars say, police are cutting and pasting parts of faces onto suspect images to generate face recognition “hits.” In others, officials are encouraged to run police sketches of suspects through face surveillance algorithms to try to identify people. Astonishingly, the scholars find, police in New York have on more than one occasion substituted images of famous people, including Woody Harrelson and a Knicks player, in place of suspect photos that were too grainy to be useful for face recognition purposes.
In some cases, these practices have led police to identify people who were ultimately arrested and prosecuted. According to NYPD figures, police have made nearly 3,000 arrests based on these identifications. But despite the technology’s growing use in law enforcement, criminal defendants are almost never notified that face surveillance was used to identify them as suspects, raising significant constitutional due process concerns.
The reports also provide the most comprehensive view the public has seen yet of the development of city-wide, real-time face surveillance and tracking operations in Detroit and Chicago. Neither Michigan nor Illinois has a law regulating law enforcement’s use of the technology, which multiple studies have shown to be highly inaccurate when evaluating the faces of women and people of color. Neither police department has policy protections in place to adequately guard residents of those cities against dystopian surveillance and tracking of everyday life or political speech, or to prevent racial profiling.
Earlier this month, the ACLU published emails between a face surveillance vendor called Suspect Technologies and the Plymouth, Massachusetts police department. The emails sketch out the contours of a plan, in the works for two years, to use Suspect’s technology to track the movements and behaviors of people in 21 or 22 public buildings around the town of 60,000 people, including in schools. In the emails, Suspect Technologies CEO Jacob Sniff acknowledges that his product may work only 30 percent of the time, and admits that it might produce as many as one false positive “hit” per day. After the ACLU obtained the emails and notified journalists about them, the Chief of Police in Plymouth backed away from the plans.
In response to the Georgetown reports, ACLU of Massachusetts executive director Carol Rose said: “People often think of facial recognition as objective science, but Georgetown’s findings show law enforcement is using the technology in highly subjective ways, by running scans against police sketches of suspects, editing photos by combining facial features from different people, and in some cases even melding two faces into one. These practices—which are occurring largely in the dark, and absent any regulation—should concern every freedom-loving person. These reports underscore the urgent need for the Massachusetts state legislature to pass a moratorium to press pause on the government’s use of face surveillance technology.”
“In a free society, you shouldn’t be tracked by the government as you visit the doctor, seek mental health or substance use treatment, worship your religion, or attend a political meeting. But face surveillance enables persistent, constant tracking of not one person’s First Amendment protected activity, but every person’s—and not on just one day, but on all days,” said Kade Crockford, director of the ACLU of Massachusetts Technology for Liberty Program. “That’s exactly what is happening in Detroit and Chicago, and we need to make sure it doesn’t spread to the Bay State. Right here in Massachusetts, we’ve uncovered documents that show private companies are going to great lengths to use our communities as testing grounds for this very type of authoritarian surveillance. That’s why from San Francisco to Somerville, local governments are moving to ban the use of these technologies by government entities. Until the state legislature in Massachusetts adopts a moratorium, that’s the only way we can make sure the types of abuses laid out in Georgetown’s report aren’t occurring here in the Commonwealth.”
Photo: The NYPD is sometimes photoshopping random facial features onto suspect photos, and then searching those images using face recognition tech. Source: https://www.flawedfacedata.com/