Apple alters course on CSAM scanning over privacy concerns

16/08/2021 | Reuters

Following considerable objections and privacy concerns about its plans to scan iPhones for images of child sexual abuse, Apple has confirmed that it will now only scan for CSAM images flagged by clearinghouses in multiple countries. Researchers will be able to verify that the database of image identifiers is identical on every device, demonstrating that it cannot be adapted to target specific individuals. The company also stated that 30 matched CSAM images would be required before the system flags an account for human review.
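To illustrate the two safeguards in plain terms, the sketch below models them as simple set operations. Everything here is hypothetical shorthand: Apple's actual design performs on-device perceptual hashing with cryptographic private set intersection and threshold secret sharing, so no plain-text hash lists or match counters like these exist on a device.

```python
from collections import Counter

# Hypothetical threshold from the article: 30 matches before human review.
REVIEW_THRESHOLD = 30

def eligible_hashes(clearinghouse_lists: list[set[str]],
                    min_lists: int = 2) -> set[str]:
    """Only hashes flagged by clearinghouses in at least two separate
    jurisdictions count, so a single government cannot unilaterally
    insert an identifier to target an individual."""
    counts = Counter(h for lst in clearinghouse_lists for h in lst)
    return {h for h, n in counts.items() if n >= min_lists}

def needs_human_review(device_matches: set[str],
                       clearinghouse_lists: list[set[str]]) -> bool:
    """True only once matches against the eligible set reach the threshold."""
    matches = device_matches & eligible_hashes(clearinghouse_lists)
    return len(matches) >= REVIEW_THRESHOLD

# Example: a hash flagged by only one clearinghouse never qualifies.
us_list = {"h1", "h2", "h3"}
eu_list = {"h2", "h3", "h4"}
print(eligible_hashes([us_list, eu_list]))  # {'h2', 'h3'}
```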


What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content, or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, from which more than 5,750 summary articles have been posted to the online archive, dating back to the beginning of 2020. A weekly roundup is emailed every Friday.
