A former rocket scientist recently used a powerful metaphor before the UK Parliament to emphasise the importance of quality data in the generative AI debate. Peter Waggett, IBM's UK research director, recalled how he had once used the ozone layer as a calibration constant, until the discovery of the ozone hole completely changed that assumption. Waggett admitted he had missed this crucial detail and asked himself why he had not spotted it sooner; it emerged that the database had been built on the assumption that any reading that was not constant must be incorrect and should be discarded. The experience, he said, taught him the importance of understanding the data being fed into a system rather than accepting it at face value, and of knowing what is happening behind the scenes.