
What you need to know

Police departments across the country are beginning to use artificial intelligence to help them complete their paperwork.

According to a report by the Associated Press, AI can help officials save time writing reports and do their jobs more efficiently.

But experts warn of the potential dangers of using new technologies: They could lead to serious, sometimes life-changing errors in reporting, promote bias and racism, and reveal private information, Politico reports.

A new AI product from technology company Axon called Draft One is designed to help police officers create reports in less time than traditional methods, according to the company's website.

“Today, officers spend up to two-thirds of their day on paperwork,” Axon's website states. “Our AI research team is working to reduce the time spent on paperwork by improving the efficiency and accuracy of reporting and information analysis in law enforcement.”

For example, Axon says its product gives police officers the ability to automatically redact footage from body cameras, “so that officers can share footage with the public more quickly while protecting the privacy of those captured on video.”

The AI software allows supervisors to review footage and reports so they can “better understand compliance and provide feedback to improve training and police-community relations.”

Draft One has received the “most positive response” of all the products the company has brought to market, Axon founder and CEO Rick Smith told the Associated Press, but he added: “There are certainly concerns.”

District attorneys want to make sure police officers do not rely solely on an AI chatbot to prepare a report because they may have to testify in court about an alleged crime, Smith told AP.

“You never want to call an officer to the stand who says, 'The AI wrote that, not me,'” Smith told the outlet.

An AP investigation found “serious flaws” in another crime-fighting AI program, ShotSpotter, a gunshot detection tool that, according to its website, uses “sensors, algorithms and artificial intelligence” to classify the 14 million sounds in its database as either gunshots or something else.

In one case in which evidence from ShotSpotter was presented in court, an Illinois grandfather named Michael Williams was held in jail for more than a year on charges of shooting a man in 2021, based on audio evidence that prosecutors said they obtained using the AI tool, the AP reported in 2022.

As Williams' car drove through an intersection, the AI technology reportedly picked up a loud noise, and prosecutors concluded that Williams had shot the passenger in his car, the AP reported.

However, Williams told authorities that a person in another vehicle pulled up next to his car and shot his passenger, the AP reported. A judge eventually dismissed the case because prosecutors said they did not have enough evidence to proceed.

“I kept trying to figure out how they could get away with using technology against me like that,” Williams told AP.


In a number of other applications used by lawyers, “AI creates 'hallucinations,' or false cases, citations and legal arguments that appear correct but do not actually exist,” an expert at the law firm Freeman Mathis & Gary wrote in an article on its website about the risks and issues associated with using generative AI in the legal industry.

These concerns extend to the accuracy of chatbots, which are now being used by prosecutors and defense attorneys to comb through documents for relevant evidence, write legal opinions, and develop complex trial strategies. “A [2024] Stanford University study found that 75% of responses to a test case generated by AI chatbots were incorrect,” the expert wrote.

“In addition, AI cannot adequately answer legal questions that touch more than one area of law. For example, for a question involving both immigration law and criminal law, it may provide an answer that is accurate for immigration law purposes but ignores all criminal law issues and implications.”

While the use of AI in policing could be helpful, precautions and safeguards need to be put in place, Jonathan Parham, a former police chief in Rahway, New Jersey, told Politico.

“AI should never replace the officer – it should enhance his operational competence,” Parham told Politico.