AI Now: Predictive policing systems are racist because corrupt cops produce dirty data

The Next Web

The AI Now Institute’s Executive Director, Andrea Nill Sánchez, today testified before the European Parliament LIBE Committee Public Hearing on “Artificial Intelligence in Criminal Law and Its Use by the Police and Judicial Authorities in Criminal Matters.” Her message was simple: “Predictive policing systems will never be safe… until the criminal justice system they’re built on is reformed.” Sánchez argued that predictive policing systems are built on “dirty data” compiled over decades of police misconduct, and that no current technological method can fix this. Her testimony was based on a detailed study conducted by the…

This story continues at The Next Web
