What Can Go Wrong When Police Use AI to Write Reports?

by Matt Agorist, The Free Thought Project:

(Electronic Frontier Foundation) Axon—the maker of widely used police body cameras and tasers (and a company that keeps trying to arm drones)—has a new product: AI that writes police reports for officers. Draft One is a generative AI system, built on a large language model, that reportedly takes audio from body-worn cameras and converts it into a narrative police report that officers can then edit and submit after an incident. Axon bills the product as the ultimate time-saver for police departments hoping to get officers out from behind their desks. But this technology could present new problems for those who encounter police, especially the marginalized communities already subject to a disproportionate share of police interactions in the United States.

Responsibility and the Codification of (Intended or Otherwise) Inaccuracies

We’ve seen it before: grainy, shaky police body-camera video in which an arresting officer shouts, “Stop resisting!” That phrase can precede greater use of force by officers or bring enhanced criminal charges. Sometimes those shouts may be justified. But as we’ve seen time and again, the narrative of someone resisting arrest may be a misrepresentation. Integrating AI into narratives of police encounters could make an already complicated system even more ripe for abuse.

The public should be skeptical of a language algorithm’s ability to accurately process and distinguish between the wide range of languages, dialects, vernacular, idioms, and slang people use. As we’ve learned from watching content moderation develop online, software may have a passable ability to capture words, but it often struggles with context and meaning. In an often tense setting such as a traffic stop, AI mistaking a metaphorical statement for a literal claim could fundamentally change how a police report is interpreted.

Moreover, as with all so-called artificial intelligence taking over consequential tasks and decision-making, the technology can obscure human agency. Police officers who deliberately lie or exaggerate to shape the narrative captured in body-camera footage now have an added veneer of plausible deniability with AI-generated police reports. If police were caught in a lie about what’s in a report, an officer could claim they never lied: the AI simply mistranscribed what was happening in the chaotic video.

Read More @ TheFreeThoughtProject.com

Originally Posted at https://www.sgtreport.com
