AI Police Reports Don’t Deliver Justice, They Automate Bias and Strip Away Accountability
- ural49
- Sep 30
- 2 min read

Artificial intelligence is creeping deeper into policing, and many in minority communities are alarmed. In Fort Collins, Colorado, Officer Scott Brittingham boasted that AI software Draft One slashed his report writing from 45 minutes to ten. “Spending less time writing reports means I can take more calls for service and be proactive,” he told CNN. Police reports shape prosecutions, bail decisions, and courtroom narratives. As law professor Andrew Guthrie Ferguson said, “Police reports are really an accountability mechanism… a justification for state power.” Turning that power over to an algorithm trained on biased data deepens injustice.
Draft One, created by Axon—the same company that sells tasers and body cameras—transcribes body camera audio into draft reports. Officers can delete the AI's prompts and submit the final report without further changes, and the original drafts aren't saved. That means a defendant can't even check what the machine first generated. "Radical transparency is the best practice," Ferguson warned. Yet Fort Collins police don't even include disclaimers that AI helped write their reports. King County prosecutors in Washington flatly rejected AI-drafted reports, warning of "unintentional errors."
Axon claims to calibrate Draft One to avoid “hallucinations” and works with “community leaders,” but trust is earned through actions, not PR. As ACLU analyst Jay Stanley said, “When you see this brand-new technology being inserted into the heart of the criminal justice system, which is already rife with injustice and bias, it’s definitely something we take a close look at.” For Black and Brown communities, that “close look” is survival. We’ve seen facial recognition technology falsely identify and jail Black men. We’ve seen police body cams disappear when it’s convenient. Now, AI threatens to automate lies that can cost us freedom—or lives.
Link: CNN