Apple recently introduced a new technology to spot child sexual abuse material (CSAM), but it’s getting more criticism than praise from the privacy community.
Although Apple has previously been hailed as one of the few Big Tech companies that actually cares about user privacy, the new CSAM-scanning technology introduced last week throws a big wrench into that reputation. Experts say that even though Apple promises user privacy, the technology will ultimately put all Apple users at risk.
“Apple is taking its step down a very slippery slope; they have fleshed out a tool which is at risk for government back doors and misuse by bad actors,” Farah Sattar, founder and security researcher at DCRYPTD, told Lifewire in an email interview.