A LEGISLATIVE COMMISSION is recommending strict new curbs on the use of facial recognition technology by the police, including requiring a warrant and limiting its use to investigating felony crimes. 

“Because the technology has advanced, I think there’s a growing awareness of how often mass surveillance exists…and wanting to make sure the government is providing reasonable restrictions and regulations about how it’s used,” said Sen. Jamie Eldridge, an Acton Democrat who co-chaired the commission along with Rep. Michael Day, a Stoneham Democrat. 

But police and prosecutors who sat on the commission say the recommendations go too far in restricting law enforcement. Norwood Police Chief William Brooks, a commission member, said no one is being protected by requiring a police officer to get a warrant just to run a suspect’s picture through a facial recognition database. “Facial recognition is analogous to an anonymous tip line,” Brooks said. “It’s just a lead. All it does is focus the police in a particular direction.” 

Facial recognition technology has emerged in recent years as a new tool for law enforcement to investigate crimes. At its core, it is technology that analyzes images of human faces. The police can use it to identify someone by matching an image with a database, like the RMV driver’s license database. They can potentially use it to track an individual’s activities, by finding their faceprints on a network of surveillance cameras. 

Law enforcement agencies say the technology can help identify suspects involved in serious crimes. It can be used in counterterrorism cases and to search for missing children. Brooks said it is often used for things like identifying a suspect in a theft, based on pictures from surveillance camera footage. 

But organizations like the ACLU are raising concerns that facial recognition technology infringes on privacy rights. Several studies have also found problems with the accuracy of the technology. A study by MIT researchers found that people with darker skin were more likely to be misidentified by facial recognition technology than people with lighter skin, and women were more likely to be misidentified than men.  

In Massachusetts, a survey distributed by the commission found that of 156 police departments and nine district attorney’s offices that responded, 80 percent, or 131 agencies, said they do not use facial recognition technology and have no plans to. Only nine currently use the technology, 11 used it previously, and 12 plan to use it. The commission sent a follow-up survey to the agencies that had used the technology and found that none had established standards or guidelines for its use, which the commission called “alarming and problematic.” Eight municipalities – including Boston and Springfield – have banned or restricted the use of facial recognition technology by municipal government or police. 

Massachusetts has been among the leading states in regulating the use of facial recognition technology. The 2020 police reform law required the police to get court approval before conducting a facial recognition search, with a showing that the police had reasonable grounds to believe it was tied to a criminal investigation or would mitigate a risk of harm. There were exceptions in cases of emergency or to identify a dead person. Police requests to use databases maintained by the RMV, FBI, or State Police must be made in writing, and agencies must report to the state on how the technology is used. 

Lawmakers attempted to ban the police from using facial recognition technology except with a warrant or in emergency circumstances, but Gov. Charlie Baker, a Republican, refused to sign that section. The final law reflected a compromise, and created the 21-member commission to study further regulation. 

The commission’s report made 13 recommendations that would place significant safeguards on how the technology can be used.  

Fifteen commissioners voted in favor of the recommendations. Four dissented, and two abstained. The two police chiefs serving on the commission – Brooks and Gloucester Chief Edward Conley – and the one district attorney, Cape and Islands DA Michael O’Keefe, were among the dissenters.  

The biggest shift from current law is that the commission recommended that facial recognition software be used only to investigate felonies, and only with a warrant based on probable cause that a person has committed a felony. There would be exceptions for emergencies.  

Kade Crockford, technology for liberty program director at the ACLU of Massachusetts and a member of the commission, said the probable cause warrant for searches is “the gold standard of American justice.”  

Day compared the invasive nature of facial surveillance to wiretapping, which requires a warrant. “We want to make sure there are checks and balances in place so we can balance the need for criminal investigation with individual rights,” Day said. 

Eldridge said the commission did not want to let the police use intrusive technology for misdemeanor crimes that could be as minor as auto violations. 

But Brooks said the technology is often helpful for misdemeanor crimes – for example, to identify a suspect who stole an iPhone from a porch or a lawn mower from a store, if that suspect is caught on a surveillance camera. A crime as serious as assault and battery can be a misdemeanor. 

In a letter to the commission chairs, O’Keefe worried that the recommendations would hamstring the police too broadly, and wrote that limiting its use to felonies “leaves the investigation of significant criminal conduct unaided by this tool.”  

Brooks and O’Keefe argued that the use of facial technology is only one step toward investigating a crime and no one is arrested based solely on a facial recognition match. “A ‘match,’ like a ‘tip,’ merely provides a clue for investigators to follow in the same way they would the tip,” O’Keefe wrote.  

Brooks added that search warrants are used in places where someone has an expectation of privacy, like their home, and requiring a warrant just to run the photo through a database is “overreaching.”  

The commission also recommends banning the use of facial recognition for surveillance or tracking or for “emotion recognition,” a type of search meant to identify emotions like aggression, all of which the report says are “nascent, overreaching technologies with low reliability.”  

Eldridge said the goal of that section is to prevent the police from parking near a protest or in a neighborhood and using facial recognition technology to identify people with outstanding warrants who have nothing to do with the crime being investigated. Crockford added that this type of technology is used by authoritarian governments to track people’s movements using surveillance cameras. 

“We got a pretty good sense law enforcement isn’t using that, but we wanted to be clear that they shouldn’t,” Day said.  

The commission also wants to create a centralized process, in which a municipal law enforcement agency would have to make a request to the State Police, and the State Police would then run the facial recognition search or request that the FBI run one. Searches could use only facial recognition software approved by the Executive Office of Technology Services and Security, a restriction meant to ensure the software is as accurate as possible.  

Any evidence obtained by using facial recognition technology outside the parameters of the law would be inadmissible in court. A defendant would have to be notified if facial recognition were used in their case. 

It will be up to legislative leadership whether to consider the recommendations. Day said the Judiciary Committee, which he and Eldridge chair, will consider carefully all the recommendations that came out of the various commissions formed by the criminal justice and police reform laws. “We don’t intend to have reports put up on a shelf and collect dust,” he said.