Campaigners call on Apple not to scan iPhones for child sexual abuse

19/08/2021 | Independent

Privacy International, Big Brother Watch and Liberty have joined 90 campaign groups calling for Apple to abandon its plans to scan iPhones for images of child sexual abuse. Privacy and security experts are concerned that the system could be repurposed to scan for other kinds of images. “Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter.

Apple also released a paper detailing the safeguards implemented in the system. The Verge reports that the tech company said it will not rely on a single government-affiliated database to identify illicit images.


What is this page?

You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content, or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.

The Privacy Newsfeed monitors over 300 global publications, of which more than 5,750 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.

Freevacy has been shortlisted in the Best Educator category of the PICCASO Privacy Awards, which recognise the people making an outstanding contribution to this dynamic and fast-growing sector.