Recently, Apple has faced criticism from many users over a new feature planned for iOS 15: the ability to detect indecent photos of children on the device.
The on-device algorithm, which flags child abuse material and other pornographic content involving minors, is intended to protect children.
However, the feature has raised widespread privacy concerns. Many users fear that photos from their albums will be uploaded to the cloud and viewed by back-end personnel, which has triggered a round of heated discussion.
Apple has explained that it will not scan an owner’s entire photo library for child abuse pictures, but will instead use cryptography to compare photos against a known database provided by the National Center for Missing and Exploited Children in the United States. Even so, some users remain anxious and dissatisfied.
Craig Federighi, Apple’s senior vice president of software engineering, defended the company’s controversial child safety features in an interview with The Wall Street Journal today.
The interview revealed new details about Apple’s safeguards for scanning user photo libraries for child sexual abuse material (CSAM).
Craig Federighi reiterated that the detection algorithm runs entirely on the device. A user’s photo library must match roughly 30 known CSAM images before Apple is alerted.
Only when that threshold of 30 matches against known child sexual abuse images is reached does Apple learn the user’s account and the status of those images.
It should be noted that Apple does not learn any specific details about the pictures in a user’s album during this process; the local algorithm only signals to Apple that the phone may contain matching content once the comparison threshold is passed.
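The threshold logic Federighi describes can be sketched conceptually as follows. This is a minimal illustration only: the class name, the plain set-membership check, and the example fingerprints are all assumptions for demonstration, whereas Apple’s real system relies on perceptual hashing and cryptographic private set intersection so that neither side learns about non-matching images.

```python
# Conceptual sketch of threshold-based on-device matching.
# NOT Apple's actual protocol: real CSAM detection uses perceptual
# hashes and cryptographic techniques, not plain string comparison.

MATCH_THRESHOLD = 30  # roughly the figure Federighi cited


class OnDeviceMatcher:
    """Counts matches against a known-fingerprint database locally;
    only signals once the cumulative count reaches the threshold."""

    def __init__(self, known_hashes, threshold=MATCH_THRESHOLD):
        self.known_hashes = set(known_hashes)  # hypothetical fingerprint set
        self.threshold = threshold
        self.match_count = 0

    def check_image(self, image_hash):
        # Compare one image fingerprint entirely on the device.
        if image_hash in self.known_hashes:
            self.match_count += 1
        # Below the threshold, nothing is revealed to the server.
        return self.match_count >= self.threshold


# Usage: 29 matches stay silent; the 30th crosses the threshold.
matcher = OnDeviceMatcher({f"h{i}" for i in range(100)})
for i in range(29):
    assert matcher.check_image(f"h{i}") is False
print(matcher.check_image("h29"))  # True: threshold reached
```

The key design point this illustrates is that the decision is cumulative and local: no single match triggers anything, which is how the system avoids flagging an account over an isolated false positive.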
In addition, most users need not worry about this feature for now: Apple has made clear that the system applies only in the United States and is active only when the user has iCloud Photos enabled.