Apple announced Friday that it would delay a suite of features aimed at limiting the spread of Child Sexual Abuse Material (CSAM) that had raised serious privacy concerns.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement to The Hill.
Two of the three features had drawn significant criticism.
One would alert parents if their children sent or received sexually explicit images. The other would scan photos in a user’s iCloud library for CSAM and flag any matching images to Apple moderators.
Apple would then report detected material to the National Center for Missing and Exploited Children, a national clearinghouse that works with law enforcement. (Read more from “Apple Delaying Plan To Scan Phones for Child Sex Abuse Images” HERE)