Meta has created a taskforce to investigate claims made in a report published by the Stanford Internet Observatory (SIO) that Instagram is hosting and promoting the sale of self-generated child sexual abuse material (SG-CSAM).
According to the SIO, Instagram is “currently the most important platform for these networks”, and the platform's features, such as recommendation algorithms and direct messaging, help connect buyers with sellers.
Instagram allows users to search for terms that its algorithms associate with SG-CSAM. The platform shows a pop-up warning that the results “may contain images of child sexual abuse” but presents users with an option to “see results anyway”. Instagram has since disabled this option.
What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, from which more than 4,250 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.