A.G. mulls statewide policy on facial recognition technology

Reformers have rejected the policing tool as inaccurate and racist

By Dana DiFilippo - February 25, 2022 7:00 am

A video monitor displays attendees as their images are captured with CyberLink’s facial recognition during CES 2020 at the Las Vegas Convention Center on January 8, 2020 in Las Vegas. (Photo by David Becker/Getty Images)

Supporters of facial recognition technology say it can help law enforcement identify elusive criminals and find missing people. Critics counter it’s a privacy invasion that contributes to racist policing.

The New Jersey Attorney General’s Office now wants the public to weigh in. The agency has opened an online portal to collect public input officials say will help them shape statewide policy on the technology’s use in law enforcement. Public comment will be accepted until March 11.

There is currently no statewide policy, though former Attorney General Gurbir Grewal’s January 2020 order barring New Jersey police departments from using the facial recognition app Clearview AI remains in place, spokesman Peter Aseltine said. Grewal had expressed general concerns about the software’s safeguards.

The office already has identified three guiding principles for the policy: ensuring technology companies don’t improperly collect personal images in violation of the rules of social media or other sites; permitting law enforcement agencies to use it only to generate leads when they have no suspects, rather than as the sole basis for arrest or prosecution; and allowing agencies to use it only for specific investigations, instead of ongoing “dragnet” surveillance.

While civil rights defenders and police reformers applaud officials for seeking community input, they say the state should reject the technology altogether, as places like Vermont, Maine, San Francisco, Portland, and Boston have done.

“This has the makings of some really Big Brother stuff, people being surveilled without their consent,” said activist Zayid Muhammad, a community organizer with Newark Communities for Accountable Policing.

Muhammad’s group and Black Lives Matter of Paterson have called on state officials to focus on police reform instead.

The technology uses biometrics to map facial features from people caught on camera, and then compares the results to a database of known faces to hunt for a match. In everyday life, it’s used to help people do things like unlock their phones and tag friends on social media.

But in policing, it has been controversial.

“It’s simply not actually a very reliable technology in law enforcement contexts,” said Dillon Reisman, a legal fellow with the American Civil Liberties Union of New Jersey. “They are often inherently biased against darker-skinned people of color, particularly women. They don’t work well in poor lighting and on grainy security surveillance-type images.”

Studies done by MIT and Stanford University in 2018 found some facial analysis algorithms misidentified Black women nearly 35% of the time, while mostly getting it right for white men. A government study backed those findings up in 2019, concluding the technology performs poorly when examining the faces of women, people of color, the elderly, and children.

Such misidentifications have resulted in lawsuits, including one a Paterson man filed in 2020 after Woodbridge police arrested him in 2019 based on a facial recognition match that wrongly linked him to a vehicle assault. The man, Nijeer Parks, spent 10 days in jail even though he told police he didn’t have a driver’s license or car, had never been to Woodbridge, and had an alibi at the time the incident occurred. The charges were eventually dismissed.

Jennifer Sellitti is an attorney in the New Jersey Office of the Public Defender.

“People assume there’s no way bias can creep into technology. Yet everything from the way algorithms are programmed to the pool of people whose photos are going into the system have subjective components that cause problems,” Sellitti said.

“Take the overrepresentation of people of color in criminal justice photographs,” she added. “Black people are arrested at higher rates, they’re more likely to be included in those photos, and there are more likely to be multiple photos of them than, say, me.”

The technology also tends to be deployed most often in communities of color, contributing to biased policing, Reisman added.

Officers have also used it to spy on and arrest protesters, Reisman said.

“This represents a major chilling effect on people’s First Amendment rights,” he said. “Giving the police the ability to identify anyone at the push of a button is a serious threat on people’s abilities to live free lives publicly.”

In recent years, police in New Jersey and nationally have pledged to rebuild relationships with communities where over-policing and systemic racism have destroyed the public’s trust in law enforcement. Using facial recognition technology is not the way to do that, Muhammad said.

“This does nothing but anger a whole lot of folks who are just really, really tired of being traumatized by police abuse and this complete unwillingness to do anything about it,” he said.

Reisman agreed: “Surveillance tools like this are constantly putting the public on notice that they are the object of suspicion. That results in the breakdown of trust.”



Dana DiFilippo

Dana DiFilippo comes to the New Jersey Monitor from WHYY, Philadelphia’s NPR station, and the Philadelphia Daily News, a paper known for exposing corruption and holding public officials accountable. Prior to that, she worked at newspapers in Cincinnati, Pittsburgh, and suburban Philadelphia and has freelanced for various local and national magazines, newspapers and websites. She lives in Central Jersey with her husband, a photojournalist, and their two children.