A new report published by researchers at the Digital Future for Children centre at the London School of Economics and Political Science finds that legislation and regulation drove the changes four providers of digital platforms and services made to protect children's privacy and improve online safety.
The report identified 128 child safety and privacy changes made by Meta, Google, TikTok, and Snap between 2017 and 2024, and highlights the influence of regulatory measures such as the Age-Appropriate Design Code (AADC) and the UK Online Safety Act 2023 (OSA) on these changes. Despite these positive steps, the report identifies an over-reliance on tools such as parental controls, which have shown low efficacy. The researchers offer eleven recommendations to strengthen child safety legislation, including working collaboratively to develop industry best practices, regulations setting out what good practice looks like, mandatory access to data for child safety research, and transparency around child safety changes.