Apple confirms that it has stopped plans to roll out CSAM detection
source link: https://9to5mac.com/2022/12/07/apple-confirms-that-it-has-stopped-plans-to-roll-out-csam-detection-system/
Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized due to privacy concerns. After putting it on hold indefinitely, Apple has now confirmed that it has stopped its plans to roll out the CSAM detection system.
Apple will no longer scan for CSAM in iCloud Photos
On the same day that the company announced Advanced Data Protection with end-to-end encryption for all iCloud data, it also put an end to the never-released CSAM scan. The news was confirmed by Apple’s senior vice president of software engineering, Craig Federighi, in an interview with WSJ’s Joanna Stern.
When the CSAM scan was announced, Apple said that iCloud Photos would be able to detect known CSAM in users’ photos by matching them against a database of CSAM image hashes. That way, the company would be able to flag such photos using on-device processing without ever having to see users’ photos.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes, without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.
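The matching idea behind PSI can be illustrated with a toy Diffie-Hellman-style sketch. This is not Apple’s production protocol (which is considerably more involved and uses vetted parameters and secure key handling); the prime, the item names, and the helper functions below are illustrative assumptions only. The point is that each side only sees blinded values, and only items present in both sets produce a match:

```python
import hashlib
import secrets

# Toy DDH-based private set intersection (PSI) sketch -- illustrative only,
# NOT Apple's actual protocol, and the parameters are insecure by design.

P = 2**127 - 1  # small Mersenne prime; fine for a demo, far too small in practice

def h(item: bytes) -> int:
    """Hash an item (e.g. an image hash) into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P or 1

def blind(items, key):
    """Blind each hashed item with a secret exponent: H(x)^key mod P."""
    return [pow(h(x), key, P) for x in items]

def reblind(values, key):
    """Apply a second secret exponent to already-blinded values."""
    return [pow(v, key, P) for v in values]

# Hypothetical sets: the server's known-hash list and a client's photo hashes.
server_set = [b"hashA", b"hashB", b"hashC"]
client_set = [b"hashB", b"hashD"]

a = secrets.randbelow(P - 2) + 1  # server's secret key
b = secrets.randbelow(P - 2) + 1  # client's secret key

# Each party blinds its own set, exchanges it, and re-blinds the other's.
# Exponentiation commutes, so shared items end up as identical H(x)^(ab) values.
server_double = set(reblind(blind(server_set, a), b))
client_double = reblind(blind(client_set, b), a)

# Only the overlap is revealed; non-matching items stay hidden behind blinding.
matches = [x for x, v in zip(client_set, client_double) if v in server_double]
print(matches)  # -> [b'hashB']
```

In this sketch the matching side learns which items overlap but nothing about the rest of either set, which mirrors the property described above: a match can be determined without revealing the non-matching image hashes.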
Even so, the CSAM scan resulted in a lot of criticism from users. In a statement to 9to5Mac last year, the company said it chose to “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple has now decided that it would be better to put an end to CSAM scanning for iCloud Photos. It’s worth noting, however, that other child safety features, such as restrictions in iMessage, are still available in iOS.
What are your thoughts on this decision? Let us know in the comments below.
About the Author
Filipe Espósito
Filipe Espósito is a Brazilian tech journalist who started covering Apple news on iHelp BR with some exclusive scoops, including the reveal of the new Apple Watch Series 5 models in titanium and ceramic. He joined 9to5Mac to share even more tech news around the world.