How AI is being used in police stations to help draft reports

By Clare Duffy, Emily Williams, CNN
Fort Collins, Colorado (CNN) — In his nine years at the Fort Collins, Colorado, police department, Officer Scott Brittingham says he has taken a lot of pride in the process of writing reports after each call for service.
But when the department decided to test a tool to speed things up, he was intrigued. Now, a report that might have previously taken him 45 minutes to write takes just 10 minutes.
“I was a little bit skeptical, I’m not a big technology person,” Brittingham said in a March interview at the Fort Collins police station for CNN’s Terms of Service podcast. But spending less time writing reports means Brittingham can “take more calls for service” and “be proactive in preventing crime,” he said.
Brittingham is referring to Draft One, artificial intelligence-powered software that creates the first draft of police reports, aiming to make the process faster and easier. And his experience may increasingly become the norm for police officers as departments across the country adopt the tool. It’s gaining traction even as some legal experts and civil rights advocates raise concerns that AI-drafted police reports could contain biases or inaccuracies and present transparency problems.
Axon — the law enforcement tech company behind the tool, which also makes Tasers and body cameras — said Draft One has been its fastest-growing product since it launched last year. And Axon isn’t the only player in this industry; law enforcement tech company Truleo makes a similar AI police report tool called Field Notes.
Police reports sit at the heart of the criminal justice process — officers use them to detail an incident and explain why they took the actions they did, and may later use them to prepare if they have to testify in court. Reports can also inform prosecutors, defense attorneys, judges and the public about the officer’s perspective on what took place. They can influence whether a prosecutor decides to take a case, or whether a judge decides to hold someone without bond, said Andrew Guthrie Ferguson, an American University law professor who studies the intersection of technology and policing.
“Police reports are really an accountability mechanism,” Ferguson said. “It’s a justification for state power, for police power.”
For that reason, proponents of Draft One tout the potential for AI to make reports more accurate and comprehensive, in addition to its time-saving benefits. But skeptics worry that any issues with the technology could have major ramifications for people’s lives. At least one state has already passed a law regulating the use of AI-drafted police reports.
Draft One’s rollout also comes amid broader concerns around AI in law enforcement, after experiments elsewhere with facial recognition technology have led to wrongful arrests.
“I do think it’s a growing movement. Like lots of AI, people are looking at how do we update? How do we improve?” Ferguson said of AI police report technology. “There’s a hype level, too, that people are pushing this because there’s money to be made on the technology.”
An efficiency tool for officers
After an officer records an interaction on their body camera, they can request that Draft One create a report. The tool uses the transcript from the body camera footage to create the draft, which begins to appear within seconds of the request. The officer is then prompted to review the draft and fill in additional details before submitting it as final.
Each draft report contains bracketed fill-in-the-blanks that an officer must either complete or delete before it can be submitted. The blank portions are designed to ensure officers read through the drafts to correct potential errors or add missing information.
“It really does have to be the officer’s own report at the end of the day, and they have to sign off as to what happened,” Axon President Josh Isner told CNN.
Draft One uses a modified version of OpenAI’s ChatGPT, which Axon further tested and trained to reduce the likelihood of “hallucinations,” factual errors that AI systems can generate. Axon also says it works with a group of third-party academics, restorative justice advocates and community leaders who provide feedback on how to responsibly develop its technology and mitigate potential biases.
The idea for Draft One came from staffing shortages that Axon’s police department clients were facing, Isner said. In a 2024 survey of more than 1,000 US police agencies, the International Association of Chiefs of Police found that agencies were operating at least 10% below their authorized staffing levels on average.
“The biggest problem in public safety right now is hiring. You cannot hire enough police officers,” Isner said. “Anything a police department can adopt to make them more efficient is kind of the name of the game right now.”
Axon declined to say how many departments currently use Draft One, but police have also adopted it in Lafayette, Indiana; Tampa, Florida; and Campbell, California. And given that “almost every single department” in the United States uses at least one Axon product, according to Isner, the growth potential for the product appears high.
In Fort Collins, Technology Sergeant Bob Younger decided to test Draft One last summer after seeing a demo of the tool.
“I was blown away at the quality of the report, the accuracy of the report and how fast it happened,” he said. “I thought to myself, ‘This is an opportunity that we cannot let go.’”
The department initially made the technology available to around 70 officers; now all officers have access. Younger estimates the tool has reduced the time officers spend writing reports by nearly 70%, “and that’s time we can give back to our citizens,” he said.
‘Radical transparency is best’
Isner said he’s received largely positive feedback from prosecutors about Draft One.
But last September, the prosecutor’s office in King County, Washington, said it would not accept police reports drafted with the help of AI after local law enforcement agencies expressed interest in using Draft One. In an email to police chiefs, the office said using the tool would “likely result in many of your officers approving Axon drafted narratives with unintentional errors in them.”
An Axon spokesperson said that the company is “committed to continuous collaboration with police agencies, prosecutors, defense attorneys, community advocates, and other stakeholders to gather input and guide the responsible evolution of Draft One.” They added that the AI model underlying Draft One is “calibrated … to minimize speculation or embellishments.”
But King County prosecutors aren’t the only ones concerned about errors or biases in AI-drafted police reports.
“When you see this brand new technology being inserted in some ways into the heart of the criminal justice system, which is already rife with injustice and bias and so forth, it’s definitely something that we sit bolt upright and take a close look at,” said Jay Stanley, a policy analyst with the ACLU’s Speech, Privacy, and Technology Project, who published a report last year recommending against using Draft One.
Even Ferguson, who believes the technology will likely become the norm in policing, said he worries about mistakes in transcripts of body camera footage impacting reports.
“The transcript that you get, which becomes a police report, might be filled with misunderstandings, because the algorithm didn’t understand, like, a southern accent or a different kind of accent,” Ferguson said. He added that nonverbal cues — for example, if a person nodded rather than saying “yes” out loud — might not be reflected.
Axon tries to prevent errors or missing details with those automatic blank fields. However, in a demo at the Fort Collins Police Department, CNN observed that it is possible to delete all of the prompts and submit a report without making any changes. And once a report is submitted as final, the original, AI-generated draft isn’t saved, so it’s not possible to see what an officer did or didn’t change.
Axon says that’s meant to mimic the old-school process where, even if an officer was writing by hand, their drafts wouldn’t be saved along with their final report. The company also offers an opt-in setting that lets police departments require a certain percentage of the report be edited before the draft is submitted.
And then there’s the question of transparency, and whether a defendant might know the police report in their case was drafted by AI.
Final reports created with Draft One include a customizable disclaimer by default, noting that they were written with the help of AI, but departments can turn that feature off. The Fort Collins Police Department does not include disclaimers, but officers are incentivized to make reports their own and ensure their accuracy, Younger said.
“What an officer is worried about is being critiqued or held responsible for an error or doing something and being inaccurate,” he said. “Officers are super hyper-focused on the quality and quantity of their work.”
But Ferguson said he believes “radical transparency is the best practice.” In Utah, state lawmakers passed a law earlier this year that requires police departments to include that disclaimer on final reports that were drafted by AI.
Ultimately, like so many other applications of AI, Draft One is a tool that relies on responsible, well-meaning users.
“My overall impression is that it’s a tool like anything else,” Brittingham said. “It’s not the fix. It’s not replacing us writing reports. It’s just a tool to help us with writing reports.”