In an effort to crack down on child sexual abuse material (CSAM), Apple has vowed to take more proactive measures. In summary, they will:
Warn children and their parents when a child receives or sends sexually explicit photos, and blur those photos

Use new technology in iOS to detect known CSAM images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC)

Perform the matching on-device against a database of known CSAM image hashes, instead of scanning images in the cloud

Use a technology called threshold secret sharing so that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content

Have Siri intervene in CSAM-related searches
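For readers curious about the threshold mechanism mentioned above: "threshold secret sharing" generally means a secret is split into shares such that fewer than t shares reveal nothing, while any t of them reconstruct it. Below is a toy Shamir-style sketch of that general idea. To be clear, this is not Apple's actual construction (their protocol layers this with private set intersection and other cryptography); the prime, share format, and function names here are invented for illustration.

```python
import random

# Toy Shamir threshold secret sharing over a prime field.
# Illustrative sketch only; parameters and API are invented.
P = 2**127 - 1  # a Mersenne prime, used as the field modulus

def split(secret, n, t):
    """Split `secret` into n shares; any t of them can reconstruct it."""
    # A random degree-(t-1) polynomial with the secret as its constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

With t = 3, any two shares are statistically independent of the secret, but any three reconstruct it exactly; the analogy is that individual safety vouchers reveal nothing until enough matches accumulate.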
Would you like to see this technology used against other criminal activity?
Do you think this is overreach by Big Tech?
You can read their statement here: https://www.apple.com/child-safety/
The New York Times' podcast The Daily covered it: https://www.nytimes.com/2021/08/20/podcasts/the-daily/apple-iphones-privacy.html
submitted by /u/precordial_thump