POLICE OFFICERS have greeted every technical innovation in law enforcement with suspicion, believing in each case that management had an ulterior motive.
When Boston officers were first assigned personal two-way radios in the 1970s, patrol sergeants would distribute them at roll call and collect them at the end of each shift. They did so to prevent the radios from ending up in places where they could do less good, like Boston Harbor.
All technological improvements, from the introduction of the blue callbox on street corners to the automobile to onboard computers in the cars, faced the same skepticism. Officers have suspected in every instance that mechanical innovations somehow would be used to snoop, to manufacture evidence of rule breaking, and to discipline officers unjustly.
A similar stance met the wave of civil liberties reforms from the 1950s to the 1970s. Many officers feared that a series of US Supreme Court rulings, especially Fourth Amendment decisions bolstering rights against improper search and seizure, would “handcuff” investigators.
In each case, the new systems and the new rules were integrated into police practice and improved police effectiveness and officer safety. One cannot imagine a patrol officer agreeing to start a shift today without the working two-way radio her grandfather would have tossed into the sea. Reforms forecast to stymie detectives have contributed to more effective investigations.
The use of artificial intelligence in policing may be the next tech innovation that promises both progress and controversy.
One area in which the software could make a major positive contribution is in helping to improve police decision making. Specifically, it could help police make effective use of the hundreds of petabytes (a petabyte is a million gigabytes) of video footage captured by police body cams. Use of these recordings would enable effective after-action reviews of officer decision-making on the street.
These reviews, conducted with the aim of improving judgment, could be a big boost to officers’ skills and competencies. Without the assistance of AI, the volume of records would be overwhelming. As it is, use of the recordings is generally limited to serving as evidence when complaints or questions about police conduct arise. That’s important for accountability, but we lose the great potential for broader improvements.
When we talk about police practice, performance, and behavior, we are talking about an officer’s judgment. Making consequential decisions is the characteristic activity of the police officer. Every step taken starts in the officer’s brain, which, in the end, is the most important “technology” officers use.
AI analysis of body cam footage would provide an unprecedented boost to help officers manage those brains. Decisions are shaped by many factors, including but not limited to the officer’s personality, training, biases, professional and personal experiences, and levels of fatigue and stress.
How do these factors interact in shift-to-shift decision making? Film study aided by AI would help officers answer those questions and improve their brain work. We can go from using body cam footage in the gravest cases to utilizing it in a much more far-reaching way to try to improve the way police do their job and interact with the public. Every police encounter with the public is a potential teachable moment. The objective is to help officers better understand the foundations and consequences of the choices they make.
Rarely do departments examine the decision making that takes place in the thousands of encounters an officer experiences each year. We do not ask what an officer believed were the key factors at the moment a decision had to be made. We do not examine the reasoning behind the choice the officer believed they were making. And that is just with actions that require some sort of report. Departments seldom record, and officers are seldom asked to explain, decisions not to take official action in a situation.
For example, one could learn about how officers tend to respond to angry people; to people of different ethnicities and skin colors; to people of different genders and sexual preferences; to the older and younger; to loud or quiet people. Every human brings cognitive biases to bear in their encounters with others. A proper analysis of performance can identify unconscious biases.
Officers can also learn how fatigue and stress influence judgment. They might see evidence that suggests they are better at the beginning of a shift or work week than at the end. When one works in a profession in which choices can have great consequences, such an assist in examining those choices can be of critical value. Of course, it will take time for officers to develop trust in the process. Candor, and thereby the utility of the initiative, will grow as trust grows.
Such a use of AI also would enhance the chances of early intervention with officers who are headed in the wrong direction. The program would enhance individual and departmental accountability for conduct. Many departments use complaints as part of “red flag” systems, intervening with habitual transgressors before incidents escalate, rather than waiting for “serious” misconduct to occur. With AI, police leadership can identify problems before they turn into tragedies.
AI technology is already being tapped to enhance police practices. In its long-term strategic plan, the Paterson, New Jersey, police department adopted the accountability aspect of an AI deployment. According to the department plan, “The technology automatically detects critical events such as uses of force, pursuits, frisks, and non-compliance incidents, and screens for both professional and unprofessional officer language so supervisors can then review officers’ conduct.” The plan goes on to say that Paterson is the largest police department in the Northeast using AI technology to “assist in ensuring standardization and accountability in review of police body worn camera video.”
Use of the AI assist will require a new commitment to professional development for those at the first-line service level in the department. Police departments have never adopted after-action-style improvement debriefings for patrol and detective operations (though specialized units have used them for years). For the first time in history, departments will need to commit to the improvement of bread-and-butter operations, where the overwhelming volume of police service is created and delivered.
AI-produced evidence of undesirable behavior patterns could drive behavior change. With guidelines in place, departments could create learning squads as the settings for implementation.
The squads could be composed of three officers, moderated by a supervisor. For purposes of team cohesion, teams should be made up of officers and supervisors from the same shifts and the same station houses (where there are multiple district stations). Each squad would meet monthly, at first, to review the analyses and adopt changes based on the new learning from the AI-generated analyses of each member’s performance. Squad members would review and share feedback on each other’s cases. Officers could examine and explain their reasoning.
Between the AI-generated analysis and the responses from team members, officers can learn from their own experiences. Every six months, they could meet for a one-hour session to go over each other’s decision-making patterns, aided by the AI-generated pattern analysis.
Introduction of this concept will require discussions and probably some good-faith, hard-nosed work at the bargaining table to build trust and transparency. Departments that adopt this innovation must safeguard the civil liberties of persons recorded in interactions with police.
Transparency is critical with the community as well as with officers. Police leadership would do well to involve community leaders and activists in civil rights protection, such as the ACLU and Lawyers for Civil Rights, in the development process.
Improvements in police technology and due process have been met with skepticism over the years. In almost every case, however, the changes have produced improved officer safety, improved community safety, and enhanced justice. Artificial intelligence may be the next chapter in this story.
Jim Jordan is the retired director of strategic planning at the Boston Police Department. He has taught police strategy at Northeastern University, the University of Massachusetts Lowell, and in training settings around the country.

CommonWealth Voices is sponsored by The Boston Foundation.
The Boston Foundation is deeply committed to civic leadership, and essential to our work is the exchange of informed opinions. We are proud to partner on a platform that engages such a broad range of demographic and ideological viewpoints.