The easiest way to get the most important tech news

We research all the latest tech news and every few days you'll get a short email with just the most important things you need to know.

We believe that less is more: our updates are short and succinct.
No fluff, just the important facts.

Subscribe Free

Sep 4, 2021

Apple delays the rollout of scanning for child sexual abuse material (CSAM) so it can collect feedback and make improvements.
Last month, Apple announced it would scan iMessage and iCloud Photos for child abuse images. The plan immediately drew heavy backlash from privacy advocates and security researchers, and Apple has now announced it is backtracking. Here's the official statement: "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features". There's no word on when the company plans to roll out the features. Link
Our previous update about Apple's plans to scan for CSAM.
Photo credit: Wikimedia
