

Last year, Apple announced that iCloud Photos would be able to detect Child Sexual Abuse Material (CSAM) in users’ photos by matching them against a database of known CSAM image hashes. Although the scanning would happen on-device and Apple would never see the photos themselves, the plan drew heavy criticism from privacy and security researchers.
Now, shortly after announcing the new Advanced Data Protection for iCloud, Apple executive Craig Federighi has confirmed that the company has stopped developing the CSAM detection system and will not roll it out to iPhone users.