There’s been a lot of talk about Apple’s CSAM (Child Sexual Abuse Material) scanner, and now it’s back in the news: it appears that hackers could be one step closer to tricking the scanner into producing false positives.
A Reddit user reverse engineered Apple’s NeuralHash algorithm for on-device CSAM detection. In doing so, they discovered a possible collision in the hash that could create false positives. A collision occurs when two different pieces of data produce the same hash value, checksum, fingerprint, or cryptographic digest.
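To make the idea concrete, here is a minimal Python sketch of a collision in a toy hash. It deliberately truncates a SHA-256 digest to 16 bits so a second, different input with the same digest can be found by brute force; this illustrates only the general concept of a collision, not Apple’s NeuralHash, which is a perceptual neural-network hash and is not reproduced here.

```python
import hashlib
from itertools import count

def short_hash(data: bytes) -> str:
    # Truncate SHA-256 to 4 hex characters (16 bits) so collisions are easy
    # to find by brute force. This is a stand-in for illustration only, not
    # Apple's NeuralHash algorithm.
    return hashlib.sha256(data).hexdigest()[:4]

target = b"original image bytes"
target_digest = short_hash(target)

# Search for a *different* input that produces the same digest -- a collision.
for i in count():
    candidate = f"candidate-{i}".encode()
    if candidate != target and short_hash(candidate) == target_digest:
        print(f"Collision: {candidate!r} and {target!r} both hash to {target_digest}")
        break
```

With a real cryptographic hash the full digest makes such a search infeasible, but a perceptual hash like NeuralHash is designed so that visually similar images map to the same value, which is what makes deliberately engineered collisions a concern.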