A new joint study by the DSA Civil Society Coordination Group (CSCG), the Recommender Systems Taskforce, and People vs Big Tech has criticised the first set of risk assessment reports required under the Digital Services Act (DSA) from designated Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).
The study argues that the reports fail to adequately address the actual harms of platform operations, particularly their negative effects on mental health, and do not sufficiently focus on risks arising from platform design, especially recommender systems. These systems, driven by engagement metrics, can amplify harmful content and contribute to problems such as poor mental health and political polarisation. Rather than examining the design choices intended to increase user engagement, which can be a primary cause of harm, the reports prioritise content moderation. The study concludes that these aspects need more thorough examination in future risk assessments.

What is this page?
You are reading a summary article on the Privacy Newsfeed, a free resource for DPOs and other professionals with privacy or data protection responsibilities, helping them stay informed of industry news in one place. The information here is a brief snippet relating to a single piece of original content or to several articles on a common topic or thread. The main contributor is listed in the top left-hand corner, just beneath the article title.
The Privacy Newsfeed monitors over 300 global publications, from which more than 5,750 summary articles have been posted to the online archive dating back to the beginning of 2020. A weekly roundup is available by email every Friday.