Apple May Soon Bring a Tool to Scan Your iPhone Photos for Child Abuse Content



Apple is reportedly developing a tool that would scan iPhone photos for child sexual abuse material (CSAM) using hashing algorithms. The system is said to run on the user's device rather than in the cloud, which would offer greater security and privacy.
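The reported approach boils down to comparing a fingerprint (hash) of each photo against a database of hashes of known CSAM. The sketch below is a deliberately simplified illustration of that idea, assuming a plain cryptographic hash and a hypothetical hash set; real systems of this kind are reported to use perceptual hashes (robust to resizing and recompression) and cryptographic matching protocols, neither of which is shown here.

```python
import hashlib

# Simplified sketch only: stands in for a perceptual hash, which real
# systems would use so that minor edits to an image do not change the match.
def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the photo's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes, known_hashes: set) -> bool:
    """On-device check: only the fingerprint is compared; the photo itself
    never needs to leave the device."""
    return image_fingerprint(image_bytes) in known_hashes

# Hypothetical database of known hashes (illustrative data only).
known = {image_fingerprint(b"known-flagged-sample")}

print(matches_known_database(b"known-flagged-sample", known))  # True
print(matches_known_database(b"ordinary-photo", known))        # False
```

The on-device design matters because only a short hash is ever checked against the database, not the photo's contents.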


