
Davis, California

Sunday, December 14, 2025

Police Accountability Commission members discuss potential use of AI in policing

The meeting featured a presentation by the Office of Independent Review Group 


By KATYA OKS — city@theaggie.org


On Nov. 3, the Davis Police Accountability Commission held a meeting to discuss the potential role of artificial intelligence (AI) in policing. The meeting was led by Michael Gennaco, a founding member of the Office of Independent Review (OIR) Group. The OIR Group collaborates with community members, local governments and police agencies to address oversight and accountability, conduct internal investigations into misconduct and provide training to ensure law enforcement follows up-to-date practices. 

Gennaco, who has expertise in law enforcement reform, also served as the chief attorney for the Office of Independent Review in Los Angeles County. He began his presentation with a working definition of AI for the purpose of the topics he was set to discuss, then reflected on the current state of policing and the criminal justice system through his work in the OIR Group. 

“In Santa Clara County, which we monitor, […] [there was] a murder in their jails in January this year [of an] incarcerated person,” Gennaco said. “Unfortunately, he was killed by [other inmates] and beaten to death. The assault […] went on for several minutes, and there was a camera in the dorm that captured all of it. […] No one was watching this assault happening in real time that led to this person’s death.” 

After describing the incident, Gennaco cited the lack of resources as a potential contributor to the event. 

“There are hundreds of cameras in the two jails in Santa Clara County, so there are not [enough] resources for a person to keep eyes on these cameras at all times,” Gennaco said. “AI may be able to fix that.”

Gennaco transitioned to discussing another issue concerning the nature of policing. He noted that footage from officers’ body-worn cameras is crucial to investigations, evaluating complaints and more. 

“One of the things that we do as auditors when we are reviewing the way the police department has handled a complaint is [by] looking at the police report [and] the recordings of any interviews. We also look at any body-worn camera that the officers activated that showed them responding to the event,” Gennaco said. “[But because] there are so many interactions that the police are recording and upload[ing], […] at some point [the footage goes] away and no one ever looks at [it].”

Gennaco offered a proposal on how the integration of AI could potentially help to resolve this issue. According to Gennaco, by training an AI model to sift through footage and watch out for “discourteous language, profane language, racial epithets” and “whether an officer is using force in the field,” police departments could catch these instances more effectively.

“AI can now do the work of sifting through all of this — tons of evidentiary material and identifying issues — and then the supervisor is able to take appropriate action,” Gennaco said. “The same can be done for good officer behavior as well [to create] positive reinforcement.”

The presentation also explored potential drawbacks of an overreliance on AI programs for policing. One example Gennaco brought forth was AI facial identification. 

“This is sort of a generalization, but generally, folks that are trans or LGBTQ+ tend to not always look the same like some of the rest of us: They are altering their hair, [and] a lot of other things that the robot is using to identify [faces],” Gennaco said. “And to some degree, that’s also true for women. Women tend to change their hairstyles more than men, for example, […] which can lead to misidentification.”

Cecilia Escamilla-Greenwald, a member of the Police Accountability Commission, brought up a concern regarding the topic. 

“The misidentification of potential women in particular minorities is a big concern,” Escamilla-Greenwald said. “If [law enforcement officials] are in fact going to use AI, […] they have a lot of work to do [to improve] in that area.”

Gennaco also brought up the use of police drones and the potential integration of AI software into them. Escamilla-Greenwald raised another potential concern.

“What if [law enforcement officials] call drones to go after protesters?” Escamilla-Greenwald said. “Protesting is a right, and to go after people who are peacefully protesting [is dangerous] […] If they start using AI and drones, rights will be stripp[ed] away, even more so than the [Trump] administration is already doing.”

Angela Willson, chairperson of the Police Accountability Commission, offered another perspective on the use of drones and AI technology. 

“On Picnic Day [last April], having drones would have been beneficial in being able to find out who the shooters are,” Willson said, referring to the incident that left two teenagers injured at Davis Community Park on April 12. 

Gennaco agreed with both sentiments, reflecting on what the mindset should be in developing such equipment. 

“The question is: Even if you tolerate the technology for its good, what guardrails can be placed to ensure that this technology is doing what you want it to do?” Gennaco said. 

No immediate decision was reached by the end of the meeting; the commission concluded that more research should be conducted before AI is implemented in the Davis Police Department. 
