The news was met with resistance from security experts and privacy advocates, as well as from users accustomed to Apple’s approach to security and privacy, which most other companies don’t match.

Apple is trying to calm fears by baking in privacy through multiple layers of encryption, fashioned in a way that requires multiple steps before anything ever reaches Apple’s final manual review.

News of Apple’s effort leaked Wednesday when Matthew Green, a cryptography professor at Johns Hopkins University, revealed the existence of the new technology in a series of tweets.
Apple said its new CSAM detection technology, NeuralHash, instead works on a user’s device, and can identify if a user uploads known child abuse imagery to iCloud without decrypting the images until a threshold of matches is met and a sequence of checks to verify the content is cleared.
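The flow Apple describes — on-device comparison against a database of known-image hashes, with no action taken until a threshold of matches is reached — can be sketched roughly as follows. This is an illustrative toy only: the hash values, threshold number, and function names are invented for the example, and Apple’s actual system uses a neural perceptual hash and cryptographic protocols (such as private set intersection) that are not reproduced here.

```python
# Hypothetical database of hashes of known abuse imagery (values invented).
KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}

# Hypothetical threshold of matches required before anything is flagged;
# Apple has not published the real number in this reporting.
THRESHOLD = 3

def scan_uploads(upload_hashes):
    """Count matches against the known set; flag only once the threshold is met.

    Below the threshold, nothing is decrypted and no action is taken.
    """
    matches = [h for h in upload_hashes if h in KNOWN_HASHES]
    if len(matches) >= THRESHOLD:
        return "flag for manual review"
    return "no action"
```

The key property the sketch illustrates is that a single match (or a handful below the threshold) produces no signal at all; only an accumulation of matches against the known set triggers the review step.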