
Apple NeuralHash vs. privacy – Pandora's box has been opened
by Dominik Bärlocher
Almost as if they never existed: Apple has seemingly deleted all mentions of its child pornography detection tool from its website. In the background, however, NeuralHash and CSAM Detection live on.
The latest update to Apple's operating system did not include a highly controversial feature: after all, the iPhone does not scan all of your pictures before uploading them to iCloud in order to automatically report child pornography.
Now, writes industry portal heise.de, Apple has deleted all mention of the detection tool, which it calls "CSAM Detection," from its website. That's not entirely true. Apple still maintains a dedicated page explaining its child protection measures; that's where the explanation of CSAM Detection used to be, and it has now disappeared. But the issue doesn't seem to be completely off the table: searching apple.com for "NeuralHash" still turns up numerous documents.
The implementation of NeuralHash at the time was called CSAM Detection. The feature came under criticism as soon as it was announced: it undermined the very encryption with which Apple promises more privacy, because NeuralHash makes it too easy to sidestep that encryption or to generate false positives, meaning harmless images flagged as known abuse material. Apple initially defended itself against all the accusations and conceded only communication errors. An internal memo last August from partner organization NCMEC, which called all critics a "screeching minority," did little to help.
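To illustrate why false positives are possible at all: perceptual hashes like NeuralHash are deliberately tolerant of small image changes, so matching against a blocklist typically uses a distance threshold rather than exact equality. The sketch below is not Apple's actual NeuralHash or CSAM Detection code; it is a minimal, hypothetical example of generic perceptual-hash matching, and all names (toy blocklist, threshold, function names) are assumptions made for illustration.

```python
# Illustrative sketch only: NOT Apple's NeuralHash, just a generic
# perceptual-hash comparison showing why false positives can occur.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical blocklist of hashes of known images (in the real system,
# hashes of CSAM would be supplied by organizations such as NCMEC).
BLOCKLIST = {0xA5F3C0119ED27B44, 0x77D1E8AB03C65F90}

# Perceptual hashes tolerate minor image edits, so a match is defined as
# "close enough" rather than identical.
MAX_DISTANCE = 4

def is_flagged(image_hash: int) -> bool:
    """Return True if the hash lies within the threshold of any blocklisted hash."""
    return any(hamming_distance(image_hash, h) <= MAX_DISTANCE for h in BLOCKLIST)

# A completely different, harmless image whose hash happens to land within
# the threshold of a blocklisted entry would be flagged too: a false positive.
colliding_hash = 0xA5F3C0119ED27B45  # differs from the first entry by one bit
print(is_flagged(colliding_hash))    # True
```

Because the comparison is fuzzy by design, researchers were able to construct unrelated images that collide with a given hash, which is the core of the false-positive criticism described above.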
As a result, Apple delayed the rollout of the backdoor in September 2021. Since then, the website had stated that criticism would be collected and analyzed over the "coming months." That notice has now also disappeared.