Predictive policing systems are flawed because they replicate and amplify racism
The AI Now Institute’s Executive Director, Andrea Nill Sánchez, today testified before the European Parliament LIBE Committee Public Hearing on “Artificial Intelligence in Criminal Law and Its Use by the Police and Judicial Authorities in Criminal Matters.”
Her message was simple: “Predictive policing systems will never be safe… until the criminal justice system they’re built on are reformed.” Sánchez argued that predictive policing systems are built on “dirty data” compiled over decades of police misconduct, and that no current technological method can resolve this problem.
Her testimony drew on an in-depth study the AI Now Institute conducted last year, which documented how predictive policing systems are inherently biased.
Source: AI Now: Predictive policing systems are flawed because they replicate and amplify racism