An investigation by The Markup and Gizmodo reveals the extent of potential bias in crime prediction software developed by PredPol. The software disproportionately targeted US neighbourhoods home to Black, Latino, and low-income residents rather than whiter, middle-to-upper-income areas. "Communities with troubled relationships with police — this is not what they need," American Civil Liberties Union Senior Policy Analyst Jay Stanley said. "They need resources to fill basic social needs."
In related news, The Canary published details of which UK police forces are using crime prediction software, including applications developed by PredPol.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. Each summary is a brief snippet relating to a single piece of original content, or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, and more than 5,750 summary articles have been posted to the online archive, dating back to the beginning of 2020. A weekly roundup is available by email every Friday.