Apple delays the rollout of scanning for child sexual abuse material (CSAM) so it can collect feedback and make improvements.
Last month, Apple announced plans to scan iMessage and iCloud Photos for child abuse imagery. The announcement immediately drew heavy backlash from privacy advocates and security researchers, and the company now says it is backtracking. Here's the official statement: "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features". There's no word on when the company plans to roll out the features. Link
Our previous update about Apple's plans to scan for CSAM.
Photo credit: Wikimedia