
An immigration agent’s use of ChatGPT for reports is raising alarms. Experts explain why


Tucked in a two-sentence footnote in a voluminous court opinion, a federal judge recently called out immigration agents using artificial intelligence to write use-of-force reports, raising concerns that it could lead to inaccuracies and further erode public confidence in how police have handled the immigration crackdown in the Chicago area and ensuing protests.

U.S. District Judge Sara Ellis wrote the footnote in a 223-page opinion issued last week, noting that the practice of using ChatGPT to write use-of-force reports undermines the agents’ credibility and “may explain the inaccuracy of these reports.” She described what she saw in at least one body camera video, writing that an agent asks ChatGPT to compile a narrative for a report after giving the program a brief sentence of description and several images.

The judge noted factual discrepancies between the official narrative about those law enforcement responses and what body camera footage showed. But experts say the use of AI to write a report that depends on an officer’s specific perspective without using an officer’s actual experience is the worst possible…

© Fast Company